Mar 18 09:02:12 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 09:02:12 crc restorecon[4762]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 09:02:12 crc restorecon[4762]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc 
restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc 
restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 
09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc 
restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc 
restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 
crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc 
restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 
crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc 
restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc 
restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 18 09:02:13 crc kubenswrapper[4778]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 09:02:13 crc kubenswrapper[4778]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 18 09:02:13 crc kubenswrapper[4778]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 09:02:13 crc kubenswrapper[4778]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 18 09:02:13 crc kubenswrapper[4778]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 18 09:02:13 crc kubenswrapper[4778]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.891437 4778 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896321 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896351 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896361 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896371 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896380 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896391 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896404 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896415 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896424 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896432 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896453 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896462 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896470 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896478 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896485 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896493 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896501 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896508 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896516 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896523 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896531 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896539 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896546 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896553 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896561 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896568 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896576 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896583 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896591 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896598 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896606 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896613 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896621 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896628 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896636 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896644 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896652 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896660 4778 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896669 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896676 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896684 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896691 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896699 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896706 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896716 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896726 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896736 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896744 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896755 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896763 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896771 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896779 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896787 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896794 4778 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896801 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896809 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896817 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896827 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896837 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896846 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896854 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896862 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896870 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896878 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896887 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896895 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896902 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896913 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896922 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896930 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896937 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.898936 4778 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.899038 4778 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901133 4778 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901247 4778 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901263 4778 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901274 4778 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901289 4778 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901303 4778 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901313 4778 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901322 4778 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901340 4778 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901351 4778 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901361 4778 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901370 4778 flags.go:64] FLAG: --cgroup-root=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901379 4778 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901389 4778 flags.go:64] FLAG: --client-ca-file=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901419 4778 flags.go:64] FLAG: --cloud-config=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901429 4778 flags.go:64] FLAG: --cloud-provider=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901438 4778 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901458 4778 flags.go:64] FLAG: --cluster-domain=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901468 4778 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901478 4778 flags.go:64] FLAG: --config-dir=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901487 4778 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901497 4778 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901516 4778 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901526 4778 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901535 4778 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901545 4778 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901554 4778 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901589 4778 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902415 4778 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902441 4778 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902455 4778 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902488 4778 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902502 4778 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902515 4778 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902526 4778 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902539 4778 flags.go:64] FLAG: --enable-server="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902550 4778 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902568 4778 flags.go:64] FLAG: --event-burst="100"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902581 4778 flags.go:64] FLAG: --event-qps="50"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902593 4778 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902605 4778 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902617 4778 flags.go:64] FLAG: --eviction-hard=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902632 4778 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902644 4778 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902654 4778 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902666 4778 flags.go:64] FLAG: --eviction-soft=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902677 4778 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902689 4778 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902701 4778 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902713 4778 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902724 4778 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902736 4778 flags.go:64] FLAG: --fail-swap-on="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902747 4778 flags.go:64] FLAG: --feature-gates=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902762 4778 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902774 4778 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902786 4778 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902813 4778 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902826 4778 flags.go:64] FLAG: --healthz-port="10248"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902838 4778 flags.go:64] FLAG: --help="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902850 4778 flags.go:64] FLAG: --hostname-override=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902861 4778 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902873 4778 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902886 4778 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902900 4778 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902911 4778 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902923 4778 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902935 4778 flags.go:64] FLAG: --image-service-endpoint=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902945 4778 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902957 4778 flags.go:64] FLAG: --kube-api-burst="100"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902968 4778 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902981 4778 flags.go:64] FLAG: --kube-api-qps="50"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902993 4778 flags.go:64] FLAG: --kube-reserved=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903005 4778 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903016 4778 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903028 4778 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903039 4778 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903052 4778 flags.go:64] FLAG: --lock-file=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903063 4778 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903075 4778 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903087 4778 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903109 4778 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903120 4778 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903132 4778 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903143 4778 flags.go:64] FLAG: --logging-format="text"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903155 4778 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903168 4778 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903179 4778 flags.go:64] FLAG: --manifest-url=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903190 4778 flags.go:64] FLAG: --manifest-url-header=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903245 4778 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903258 4778 flags.go:64] FLAG: --max-open-files="1000000"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903273 4778 flags.go:64] FLAG: --max-pods="110"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903285 4778 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903297 4778 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903308 4778 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903321 4778 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903336 4778 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903347 4778 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903361 4778 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903393 4778 flags.go:64] FLAG: --node-status-max-images="50"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903405 4778 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903417 4778 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903428 4778 flags.go:64] FLAG: --pod-cidr=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903439 4778 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903455 4778 flags.go:64] FLAG: --pod-manifest-path=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903467 4778 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903479 4778 flags.go:64] FLAG: --pods-per-core="0"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903491 4778 flags.go:64] FLAG: --port="10250"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903502 4778 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903514 4778 flags.go:64] FLAG: --provider-id=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903526 4778 flags.go:64] FLAG: --qos-reserved=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903538 4778 flags.go:64] FLAG: --read-only-port="10255"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903549 4778 flags.go:64] FLAG: --register-node="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903560 4778 flags.go:64] FLAG: --register-schedulable="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903571 4778 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903594 4778 flags.go:64] FLAG: --registry-burst="10"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903604 4778 flags.go:64] FLAG: --registry-qps="5"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903616 4778 flags.go:64] FLAG: --reserved-cpus=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903627 4778 flags.go:64] FLAG: --reserved-memory=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903641 4778 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903654 4778 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903666 4778 flags.go:64] FLAG: --rotate-certificates="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903676 4778 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903687 4778 flags.go:64] FLAG: --runonce="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903697 4778 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903708 4778 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903721 4778 flags.go:64] FLAG: --seccomp-default="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903732 4778 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903745 4778 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903757 4778 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903770 4778 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903782 4778 flags.go:64] FLAG: --storage-driver-password="root"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903793 4778 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903804 4778 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903816 4778 flags.go:64] FLAG: --storage-driver-user="root"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903827 4778 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903838 4778 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903850 4778 flags.go:64] FLAG: --system-cgroups=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903861 4778 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903881 4778 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903893 4778 flags.go:64] FLAG: --tls-cert-file=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903904 4778 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903919 4778 flags.go:64] FLAG: --tls-min-version=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903930 4778 flags.go:64] FLAG: --tls-private-key-file=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903942 4778 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903952 4778 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903963 4778 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903975 4778 flags.go:64] FLAG: --v="2"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.904006 4778 flags.go:64] FLAG: --version="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.904021 4778 flags.go:64] FLAG: --vmodule=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.904036 4778 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.904048 4778 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904386 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904401 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904415 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904427 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904441 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904457 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904468 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904478 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904489 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904499 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904513 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904526 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904538 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904552 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904564 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904574 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904588 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904598 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904608 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904618 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904628 4778 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904640 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904650 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904660 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904670 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904680 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904690 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904699 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904709 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904718 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904728 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904738 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904748 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904757 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904767 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904776 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904786 4778 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904799 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904809 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904819 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904830 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904840 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904850 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904859 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904869 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904878 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904888 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904898 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904907 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904920 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904930 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904939 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904949 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904959 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904968 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904978 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904992 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905004 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905015 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905025 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905034 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905044 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905053 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905062 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905072 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905081 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905090 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905099 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905109 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905118 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905127 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.906253 4778 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.920355 4778 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.920412 4778 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920569 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920589 4778 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920602 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920614 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920625 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920636 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920647 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920659 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920671 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920682 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920697 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920713 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920724 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920735 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920746 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920756 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920768 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920778 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920787 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920798 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920808 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920817 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920828 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920839 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920848 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920857 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920868 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920884 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920898 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920908 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920919 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920930 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920940 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920951 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920967 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920977 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920987 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 09:02:13 crc 
kubenswrapper[4778]: W0318 09:02:13.920999 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921009 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921019 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921029 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921043 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921056 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921070 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921084 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921122 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921134 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921145 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921156 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921167 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921178 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921250 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921262 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921273 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921283 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921293 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921303 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921313 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921323 4778 
feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921333 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921343 4778 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921353 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921366 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921378 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921389 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921399 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921408 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921417 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921428 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921439 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921450 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.921468 4778 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false 
RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921747 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921769 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921781 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921792 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921805 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921815 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921829 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921841 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921851 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921865 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921878 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921888 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921898 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921908 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921920 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921932 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921942 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921953 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921964 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921975 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921988 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921999 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922010 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922020 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922032 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922042 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922053 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922063 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922073 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922083 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922094 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922104 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922114 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922124 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922137 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922147 4778 
feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922156 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922166 4778 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922177 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922187 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922232 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922243 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922253 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922265 4778 feature_gate.go:330] unrecognized feature gate: Example Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922274 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922284 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922295 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922304 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922314 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922324 4778 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922333 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922343 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922353 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922363 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922373 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922383 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922392 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922402 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922411 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922423 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922433 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922446 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922456 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922466 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922477 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922486 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922496 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922510 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922522 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922534 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922546 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.922564 4778 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.924527 4778 server.go:940] "Client rotation is on, will bootstrap in background" Mar 18 09:02:13 crc kubenswrapper[4778]: E0318 09:02:13.931276 4778 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.936030 4778 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.936249 4778 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.938928 4778 server.go:997] "Starting client certificate rotation" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.938968 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.939246 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.970531 4778 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.973093 4778 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 09:02:13 crc kubenswrapper[4778]: E0318 09:02:13.973904 4778 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.000416 4778 log.go:25] "Validated CRI v1 runtime API" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.045923 4778 log.go:25] "Validated CRI v1 image API" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.048397 4778 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.053339 4778 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-08-57-09-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.053402 4778 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.066554 4778 manager.go:217] Machine: {Timestamp:2026-03-18 09:02:14.064518792 +0000 UTC m=+0.639263652 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4e5f6a1b-325c-4eb3-9961-e93f55b97b93 BootID:09c4ac70-7aed-4b4e-97f0-04cc523320b9 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:a8:27:26 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a8:27:26 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:69:0d:ac Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:15:6e:16 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a5:13:66 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:70:db:86 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:2a:87:12 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5e:42:a9:33:64:59 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:86:65:56:17:3c:1f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.066750 4778 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.066846 4778 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.067471 4778 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.067715 4778 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.067768 4778 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.068871 4778 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.068898 4778 container_manager_linux.go:303] "Creating device plugin manager"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.069518 4778 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.069554 4778 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.069812 4778 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.069923 4778 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.073456 4778 kubelet.go:418] "Attempting to sync node with API server"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.073486 4778 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.073516 4778 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.073533 4778 kubelet.go:324] "Adding apiserver pod source"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.073545 4778 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.078441 4778 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.079411 4778 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.079964 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused
Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.080105 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError"
Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.079983 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused
Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.080256 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.082072 4778 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084387 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084426 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084439 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084450 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084466 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084476 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084488 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084505 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084517 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084531 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084560 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084569 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.085649 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.087177 4778 server.go:1280] "Started kubelet"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.087241 4778 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.087361 4778 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.088319 4778 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 18 09:02:14 crc systemd[1]: Started Kubernetes Kubelet.
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.100540 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.102194 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.102295 4778 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.103031 4778 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.103009 4778 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.103181 4778 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.103104 4778 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.104416 4778 factory.go:55] Registering systemd factory
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.104526 4778 factory.go:221] Registration of the systemd container factory successfully
Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.104607 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="200ms"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.104965 4778 server.go:460] "Adding debug handlers to kubelet server"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.109702 4778 factory.go:153] Registering CRI-O factory
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.109742 4778 factory.go:221] Registration of the crio container factory successfully
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.109840 4778 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.110253 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused
Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.110410 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.109918 4778 factory.go:103] Registering Raw factory
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.110759 4778 manager.go:1196] Started watching for new ooms in manager
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.111716 4778 manager.go:319] Starting recovery of all containers
Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.111062 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.70:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123020 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123080 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123096 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123109 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123122 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123136 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123162 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123175 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123218 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123234 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123247 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123261 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123276 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123292 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123303 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123316 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123336 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123352 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123400 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123411 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123428 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123439 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123451 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123481 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123506 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123518 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123530 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123542 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123553 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123565 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123575 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123587 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123788 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123804 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123818 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123835 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123851 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123865 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123879 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123893 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123909 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123921 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123935 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123947 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123965 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123980 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123994 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124006 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124023 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124034 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124046 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124056 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124072 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124086 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124100 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124114 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124127 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124140 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124150 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124161 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124176 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124190 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124229 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124241 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125261 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125286 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125301 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125314 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125327 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125338 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125351 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125363 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125374 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125385 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125399 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125410 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125425 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125461 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125479 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125492 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125504 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125519 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125532 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp"
seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125543 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125555 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125567 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125578 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125593 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125605 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125618 4778 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125632 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125643 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125655 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125666 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125679 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125695 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125710 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125722 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125735 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125800 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125814 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125827 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125842 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125853 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125875 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125891 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125904 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125919 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125932 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125947 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125961 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125976 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125989 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126002 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" 
seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126013 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126031 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126043 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126057 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126069 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126080 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 18 09:02:14 crc 
kubenswrapper[4778]: I0318 09:02:14.126093 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126105 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126117 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126130 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126142 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126157 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126169 4778 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126182 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126211 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126259 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126275 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126287 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126301 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126315 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126325 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126335 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126346 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126357 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126367 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126376 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126386 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126398 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126408 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126420 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126430 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126442 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126453 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126466 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126476 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126487 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126498 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126508 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126517 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126530 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126541 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126552 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126563 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126573 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126585 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126596 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126607 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126618 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126635 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126647 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126659 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126672 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126683 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126693 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126706 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126718 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126728 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126740 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126751 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126763 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126774 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126785 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126796 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126808 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126819 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126830 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126840 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" 
seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126851 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126861 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126871 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126884 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126895 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126905 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 18 09:02:14 
crc kubenswrapper[4778]: I0318 09:02:14.126917 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126927 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126939 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126952 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129369 4778 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129402 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" 
seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129415 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129426 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129463 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129473 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129483 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129493 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129505 4778 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129540 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129551 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129561 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129572 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129582 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129592 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129605 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129616 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129627 4778 reconstruct.go:97] "Volume reconstruction finished" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129635 4778 reconciler.go:26] "Reconciler: start to sync state" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.148133 4778 manager.go:324] Recovery completed Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.159459 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.161145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.161184 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.161212 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.161878 4778 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.161898 
4778 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.161919 4778 state_mem.go:36] "Initialized new in-memory state store" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.176964 4778 policy_none.go:49] "None policy: Start" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.179666 4778 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.179720 4778 state_mem.go:35] "Initializing new in-memory state store" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.182459 4778 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.185796 4778 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.185838 4778 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.185875 4778 kubelet.go:2335] "Starting kubelet main sync loop" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.185928 4778 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.187681 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.187742 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: 
connect: connection refused" logger="UnhandledError" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.203478 4778 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.255692 4778 manager.go:334] "Starting Device Plugin manager" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.255755 4778 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.255772 4778 server.go:79] "Starting device plugin registration server" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.256230 4778 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.256244 4778 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.256556 4778 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.256633 4778 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.256640 4778 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.265169 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.286347 4778 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 09:02:14 crc 
kubenswrapper[4778]: I0318 09:02:14.286481 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.287929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.287966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.287977 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.288121 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.288429 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.288508 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289080 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289173 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289384 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289475 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289815 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289842 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289857 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.290108 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.290151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.290172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.290425 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.290635 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.290698 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.291110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.291127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.291136 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.293766 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.293876 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.293897 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.294313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.294354 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.294376 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.294610 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: 
I0318 09:02:14.294868 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.294944 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.296481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.296546 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.296568 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.296883 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.296955 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.297593 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.297662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.297678 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.298334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.298377 4778 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.298391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.305775 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="400ms" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331417 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331505 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331535 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331583 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331612 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331713 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331808 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331858 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331888 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" 
(UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331918 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331948 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331974 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331997 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.332019 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.332041 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.359004 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.360816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.360893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.360910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.360945 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.361922 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.70:6443: connect: connection refused" node="crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.433807 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434107 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434136 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434237 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434442 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434470 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434493 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434524 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434546 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434540 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434599 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 
09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434585 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434564 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434613 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434643 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434691 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434723 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434762 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434799 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434791 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434873 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434894 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434880 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434849 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434980 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.435009 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.435039 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.435135 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.435334 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.562567 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.564272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.564529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.564680 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.564873 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.565627 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.70:6443: connect: connection refused" node="crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.631616 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.640576 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.660988 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.669579 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.674814 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.688648 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-523f305bbaa14bef6a11c883b99104dd5f8bbdf21ecb2794d77cbbc112602d05 WatchSource:0}: Error finding container 523f305bbaa14bef6a11c883b99104dd5f8bbdf21ecb2794d77cbbc112602d05: Status 404 returned error can't find the container with id 523f305bbaa14bef6a11c883b99104dd5f8bbdf21ecb2794d77cbbc112602d05 Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.691287 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f8eee9aad49d4f5c64cf7769bb23d963c1d47bc628b49c4c2e9cc2ed6f11d9ec WatchSource:0}: Error finding container f8eee9aad49d4f5c64cf7769bb23d963c1d47bc628b49c4c2e9cc2ed6f11d9ec: Status 404 returned error can't find the container with id f8eee9aad49d4f5c64cf7769bb23d963c1d47bc628b49c4c2e9cc2ed6f11d9ec Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.699717 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-9634940ffdccff7e80603d683b6400ede86b40098940697f2f52d7cde2168c16 WatchSource:0}: Error finding container 9634940ffdccff7e80603d683b6400ede86b40098940697f2f52d7cde2168c16: Status 404 returned error can't find the container with id 
9634940ffdccff7e80603d683b6400ede86b40098940697f2f52d7cde2168c16 Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.705920 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a3fb0861f8792b15e1914aacdcd3749032aeeb584bff765731cfbe226fcd9752 WatchSource:0}: Error finding container a3fb0861f8792b15e1914aacdcd3749032aeeb584bff765731cfbe226fcd9752: Status 404 returned error can't find the container with id a3fb0861f8792b15e1914aacdcd3749032aeeb584bff765731cfbe226fcd9752 Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.706359 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="800ms" Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.707781 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-47ed5e6cd5c13985cfc22ee9570b8fb0e80a38e0d6f4ce510bf9458b6aadcc5a WatchSource:0}: Error finding container 47ed5e6cd5c13985cfc22ee9570b8fb0e80a38e0d6f4ce510bf9458b6aadcc5a: Status 404 returned error can't find the container with id 47ed5e6cd5c13985cfc22ee9570b8fb0e80a38e0d6f4ce510bf9458b6aadcc5a Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.966105 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.967407 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.967444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc 
kubenswrapper[4778]: I0318 09:02:14.967456 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.967491 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.967928 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.70:6443: connect: connection refused" node="crc" Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.102280 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.190458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a3fb0861f8792b15e1914aacdcd3749032aeeb584bff765731cfbe226fcd9752"} Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.191463 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9634940ffdccff7e80603d683b6400ede86b40098940697f2f52d7cde2168c16"} Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.192649 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"523f305bbaa14bef6a11c883b99104dd5f8bbdf21ecb2794d77cbbc112602d05"} Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.193536 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f8eee9aad49d4f5c64cf7769bb23d963c1d47bc628b49c4c2e9cc2ed6f11d9ec"} Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.194309 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"47ed5e6cd5c13985cfc22ee9570b8fb0e80a38e0d6f4ce510bf9458b6aadcc5a"} Mar 18 09:02:15 crc kubenswrapper[4778]: W0318 09:02:15.267647 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:15 crc kubenswrapper[4778]: E0318 09:02:15.268072 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:15 crc kubenswrapper[4778]: W0318 09:02:15.270024 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:15 crc kubenswrapper[4778]: E0318 09:02:15.270120 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:15 crc kubenswrapper[4778]: E0318 09:02:15.507018 4778 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="1.6s" Mar 18 09:02:15 crc kubenswrapper[4778]: W0318 09:02:15.565176 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:15 crc kubenswrapper[4778]: E0318 09:02:15.565320 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:15 crc kubenswrapper[4778]: W0318 09:02:15.573759 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:15 crc kubenswrapper[4778]: E0318 09:02:15.573839 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.768458 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.770551 4778 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.770599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.770681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.770751 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:15 crc kubenswrapper[4778]: E0318 09:02:15.771415 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.70:6443: connect: connection refused" node="crc" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.102788 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.106266 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 09:02:16 crc kubenswrapper[4778]: E0318 09:02:16.107517 4778 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.200787 4778 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5" exitCode=0 Mar 18 
09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.200953 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.200971 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.202492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.202670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.202871 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.203482 4778 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce" exitCode=0 Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.203612 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.203665 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.204854 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.204912 4778 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.204934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.206698 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4" exitCode=0 Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.206757 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.206844 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.207973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.207996 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.208006 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.210820 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88" exitCode=0 Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.210883 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.210946 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.211927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.211984 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.212012 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.214138 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.214978 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215023 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215045 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c06152147e770c821660f00750e87810309b3aa8dbbd6e1ee031be349b3cf44e"} Mar 18 
09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215064 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215135 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215299 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215321 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.216925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.216963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.216979 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.790734 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.102346 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.70:6443: 
connect: connection refused Mar 18 09:02:17 crc kubenswrapper[4778]: E0318 09:02:17.107978 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="3.2s" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.227398 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.227464 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.227475 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.227514 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.229079 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.229112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.229124 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.230257 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.230283 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.231434 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.231462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.231473 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.235171 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.235254 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.235267 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5"} Mar 18 
09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.235277 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.237885 4778 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988" exitCode=0 Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.238004 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.238017 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.238088 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.242690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.242739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.242744 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.242772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.242794 4778 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.242753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.372075 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.373637 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.373693 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.373705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.373743 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:17 crc kubenswrapper[4778]: E0318 09:02:17.374330 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.70:6443: connect: connection refused" node="crc" Mar 18 09:02:17 crc kubenswrapper[4778]: W0318 09:02:17.571300 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:17 crc kubenswrapper[4778]: E0318 09:02:17.571398 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection 
refused" logger="UnhandledError" Mar 18 09:02:17 crc kubenswrapper[4778]: W0318 09:02:17.740580 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:17 crc kubenswrapper[4778]: E0318 09:02:17.740680 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.133527 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.243081 4778 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1" exitCode=0 Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.243180 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1"} Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.243221 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.244515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.244550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.244558 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.248614 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"afe66da1be5d0bac4307a8d0d57524c3eeac9e7cfc3a44cfd27bfa2874fca59c"} Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.248693 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.248717 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.248747 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.248744 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.248834 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.249780 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.249802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.249812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.249953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.249978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.249988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.250870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.250888 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.250897 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.251172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.251233 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.251251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261216 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb"} Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261296 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2"} Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261320 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644"} Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261338 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72"} Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261339 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261510 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261604 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.263404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.263457 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.263481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.263623 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 
09:02:19.263690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.263704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.492731 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.800737 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.809735 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.919839 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.270118 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.270799 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b"} Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.270874 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.271016 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.271382 4778 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.271434 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.271448 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.271994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.272077 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.272105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.272509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.272564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.272583 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.429065 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.575130 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.576721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.576777 4778 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.576804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.576856 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.275511 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.275640 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.275800 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277586 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277535 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277535 
4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277777 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277798 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.284759 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.100107 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.282559 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.282636 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.284424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.284489 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.284512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.284430 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.284749 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.284785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:24 crc kubenswrapper[4778]: E0318 09:02:24.265343 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:02:24 crc kubenswrapper[4778]: I0318 09:02:24.451126 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:24 crc kubenswrapper[4778]: I0318 09:02:24.451509 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:24 crc kubenswrapper[4778]: I0318 09:02:24.453509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:24 crc kubenswrapper[4778]: I0318 09:02:24.453568 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:24 crc kubenswrapper[4778]: I0318 09:02:24.453586 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:25 crc kubenswrapper[4778]: I0318 09:02:25.101529 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 09:02:25 crc kubenswrapper[4778]: I0318 09:02:25.101651 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.103056 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 18 09:02:28 crc kubenswrapper[4778]: W0318 09:02:28.116119 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.116298 4778 trace.go:236] Trace[1700483078]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 09:02:18.114) (total time: 10001ms): Mar 18 09:02:28 crc kubenswrapper[4778]: Trace[1700483078]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:02:28.116) Mar 18 09:02:28 crc kubenswrapper[4778]: Trace[1700483078]: [10.001311468s] [10.001311468s] END Mar 18 09:02:28 crc kubenswrapper[4778]: E0318 09:02:28.116335 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.127908 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 
192.168.126.11:38612->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.128030 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38612->192.168.126.11:17697: read: connection reset by peer" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.144927 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.145760 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.148958 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.149009 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.149026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.301414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.305545 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="afe66da1be5d0bac4307a8d0d57524c3eeac9e7cfc3a44cfd27bfa2874fca59c" exitCode=255 Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.305629 4778 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"afe66da1be5d0bac4307a8d0d57524c3eeac9e7cfc3a44cfd27bfa2874fca59c"} Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.305958 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.307603 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.307662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.307690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.308776 4778 scope.go:117] "RemoveContainer" containerID="afe66da1be5d0bac4307a8d0d57524c3eeac9e7cfc3a44cfd27bfa2874fca59c" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.557346 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.557603 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.562683 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.562718 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.562732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:28 crc kubenswrapper[4778]: W0318 09:02:28.606673 4778 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.606793 4778 trace.go:236] Trace[1267424385]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 09:02:18.604) (total time: 10002ms): Mar 18 09:02:28 crc kubenswrapper[4778]: Trace[1267424385]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (09:02:28.606) Mar 18 09:02:28 crc kubenswrapper[4778]: Trace[1267424385]: [10.002259931s] [10.002259931s] END Mar 18 09:02:28 crc kubenswrapper[4778]: E0318 09:02:28.606828 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.610672 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.315753 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.320005 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263"} Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.320300 4778 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.320377 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.321795 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.321843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.321855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.322108 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.322312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.322481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.346231 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.711536 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z Mar 18 09:02:29 crc kubenswrapper[4778]: E0318 09:02:29.712410 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 18 09:02:29 crc kubenswrapper[4778]: E0318 09:02:29.720961 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 09:02:29 crc kubenswrapper[4778]: E0318 09:02:29.721108 4778 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.725479 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.725691 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 09:02:29 crc 
kubenswrapper[4778]: E0318 09:02:29.729369 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:02:29 crc kubenswrapper[4778]: W0318 09:02:29.731731 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z Mar 18 09:02:29 crc kubenswrapper[4778]: E0318 09:02:29.731813 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.747362 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.747471 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 09:02:29 crc kubenswrapper[4778]: W0318 09:02:29.750046 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z Mar 18 09:02:29 crc kubenswrapper[4778]: E0318 09:02:29.750152 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.930030 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]log ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]etcd ok Mar 18 09:02:29 crc kubenswrapper[4778]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/generic-apiserver-start-informers ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/priority-and-fairness-filter ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-apiextensions-informers ok Mar 18 09:02:29 crc kubenswrapper[4778]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 18 09:02:29 crc kubenswrapper[4778]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-system-namespaces-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 18 09:02:29 crc kubenswrapper[4778]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 18 09:02:29 crc 
kubenswrapper[4778]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/bootstrap-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-kube-aggregator-informers ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 18 09:02:29 crc kubenswrapper[4778]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]autoregister-completion ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/apiservice-openapi-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: livez check failed Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.930114 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.104434 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:30Z is after 2026-02-23T05:33:13Z Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.325444 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.327216 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.329700 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263" exitCode=255 Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.329841 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.329804 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263"} Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.329938 4778 scope.go:117] "RemoveContainer" containerID="afe66da1be5d0bac4307a8d0d57524c3eeac9e7cfc3a44cfd27bfa2874fca59c" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.330235 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.330785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.330830 4778 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.330846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.331591 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.331623 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.331633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.332180 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263" Mar 18 09:02:30 crc kubenswrapper[4778]: E0318 09:02:30.332356 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:02:31 crc kubenswrapper[4778]: I0318 09:02:31.104065 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:31Z is after 2026-02-23T05:33:13Z Mar 18 09:02:31 crc kubenswrapper[4778]: I0318 09:02:31.335551 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 09:02:32 crc kubenswrapper[4778]: I0318 09:02:32.106177 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:32Z is after 2026-02-23T05:33:13Z Mar 18 09:02:32 crc kubenswrapper[4778]: W0318 09:02:32.720357 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:32Z is after 2026-02-23T05:33:13Z Mar 18 09:02:32 crc kubenswrapper[4778]: E0318 09:02:32.720504 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.107704 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:33Z is after 2026-02-23T05:33:13Z Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.425988 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.426307 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.428032 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.428104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.428122 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.429046 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263" Mar 18 09:02:33 crc kubenswrapper[4778]: E0318 09:02:33.429314 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.105691 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:34Z is after 2026-02-23T05:33:13Z Mar 18 09:02:34 crc kubenswrapper[4778]: W0318 09:02:34.201310 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:34Z is after 2026-02-23T05:33:13Z Mar 18 09:02:34 crc kubenswrapper[4778]: E0318 09:02:34.201429 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:34 crc kubenswrapper[4778]: E0318 09:02:34.266104 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.928900 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.929106 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.930587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.930620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.930631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.931127 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263" Mar 18 09:02:34 
crc kubenswrapper[4778]: E0318 09:02:34.931485 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.941453 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.101463 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.101565 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.105245 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:35Z is after 2026-02-23T05:33:13Z Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.351104 4778 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.352610 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.352671 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.352693 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.353626 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263" Mar 18 09:02:35 crc kubenswrapper[4778]: E0318 09:02:35.353919 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:02:36 crc kubenswrapper[4778]: I0318 09:02:36.105554 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:36Z is after 2026-02-23T05:33:13Z Mar 18 09:02:36 crc kubenswrapper[4778]: E0318 09:02:36.116973 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-18T09:02:36Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 09:02:36 crc kubenswrapper[4778]: I0318 09:02:36.121134 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:36 crc kubenswrapper[4778]: I0318 09:02:36.123151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:36 crc kubenswrapper[4778]: I0318 09:02:36.123230 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:36 crc kubenswrapper[4778]: I0318 09:02:36.123248 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:36 crc kubenswrapper[4778]: I0318 09:02:36.123291 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:36 crc kubenswrapper[4778]: E0318 09:02:36.128317 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:36Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.105361 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:37Z is after 2026-02-23T05:33:13Z Mar 18 09:02:37 crc kubenswrapper[4778]: W0318 09:02:37.392006 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-18T09:02:37Z is after 2026-02-23T05:33:13Z Mar 18 09:02:37 crc kubenswrapper[4778]: E0318 09:02:37.392117 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.908343 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.908572 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.910095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.910183 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.910255 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.911106 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263" Mar 18 09:02:37 crc kubenswrapper[4778]: E0318 09:02:37.911505 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:02:38 crc kubenswrapper[4778]: I0318 09:02:38.095677 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 09:02:38 crc kubenswrapper[4778]: E0318 09:02:38.101878 4778 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:38 crc kubenswrapper[4778]: I0318 09:02:38.106463 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:38Z is after 2026-02-23T05:33:13Z Mar 18 09:02:38 crc kubenswrapper[4778]: W0318 09:02:38.599840 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:38Z is after 2026-02-23T05:33:13Z Mar 18 09:02:38 crc kubenswrapper[4778]: E0318 09:02:38.599951 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:02:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:39 crc kubenswrapper[4778]: I0318 09:02:39.104288 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:39Z is after 2026-02-23T05:33:13Z Mar 18 09:02:39 crc kubenswrapper[4778]: E0318 09:02:39.734577 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:39Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:02:40 crc kubenswrapper[4778]: I0318 09:02:40.106764 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:40Z is after 2026-02-23T05:33:13Z Mar 18 09:02:41 crc kubenswrapper[4778]: I0318 09:02:41.106374 4778 csi_plugin.go:884] Failed to contact API server when waiting 
for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:41Z is after 2026-02-23T05:33:13Z Mar 18 09:02:42 crc kubenswrapper[4778]: I0318 09:02:42.106400 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:42Z is after 2026-02-23T05:33:13Z Mar 18 09:02:42 crc kubenswrapper[4778]: W0318 09:02:42.846312 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:42Z is after 2026-02-23T05:33:13Z Mar 18 09:02:42 crc kubenswrapper[4778]: E0318 09:02:42.846444 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:43 crc kubenswrapper[4778]: I0318 09:02:43.104958 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:43Z is after 
2026-02-23T05:33:13Z Mar 18 09:02:43 crc kubenswrapper[4778]: E0318 09:02:43.120168 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:43Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 09:02:43 crc kubenswrapper[4778]: I0318 09:02:43.129299 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:43 crc kubenswrapper[4778]: I0318 09:02:43.131228 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:43 crc kubenswrapper[4778]: I0318 09:02:43.131298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:43 crc kubenswrapper[4778]: I0318 09:02:43.131313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:43 crc kubenswrapper[4778]: I0318 09:02:43.131358 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:43 crc kubenswrapper[4778]: E0318 09:02:43.134618 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:43Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 09:02:44 crc kubenswrapper[4778]: I0318 09:02:44.107989 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:44Z is after 
2026-02-23T05:33:13Z Mar 18 09:02:44 crc kubenswrapper[4778]: E0318 09:02:44.266338 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:02:44 crc kubenswrapper[4778]: W0318 09:02:44.789109 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:44Z is after 2026-02-23T05:33:13Z Mar 18 09:02:44 crc kubenswrapper[4778]: E0318 09:02:44.789222 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.100847 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.100937 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 
09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.101013 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.101242 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.102805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.102899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.102918 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.103773 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"c06152147e770c821660f00750e87810309b3aa8dbbd6e1ee031be349b3cf44e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.104094 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://c06152147e770c821660f00750e87810309b3aa8dbbd6e1ee031be349b3cf44e" gracePeriod=30 Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.106746 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:45Z is after 2026-02-23T05:33:13Z Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.385332 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.385959 4778 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c06152147e770c821660f00750e87810309b3aa8dbbd6e1ee031be349b3cf44e" exitCode=255 Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.386018 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c06152147e770c821660f00750e87810309b3aa8dbbd6e1ee031be349b3cf44e"} Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.106669 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:46Z is after 2026-02-23T05:33:13Z Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.394665 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.395422 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5"} Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 
09:02:46.395597 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.396829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.396884 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.396942 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.791304 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:47 crc kubenswrapper[4778]: I0318 09:02:47.107635 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:47Z is after 2026-02-23T05:33:13Z Mar 18 09:02:47 crc kubenswrapper[4778]: I0318 09:02:47.398903 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:47 crc kubenswrapper[4778]: I0318 09:02:47.406423 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:47 crc kubenswrapper[4778]: I0318 09:02:47.406492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:47 crc kubenswrapper[4778]: I0318 09:02:47.406512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:48 crc kubenswrapper[4778]: I0318 09:02:48.107149 4778 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:48Z is after 2026-02-23T05:33:13Z Mar 18 09:02:48 crc kubenswrapper[4778]: I0318 09:02:48.401895 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:48 crc kubenswrapper[4778]: I0318 09:02:48.403272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:48 crc kubenswrapper[4778]: I0318 09:02:48.403319 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:48 crc kubenswrapper[4778]: I0318 09:02:48.403341 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:49 crc kubenswrapper[4778]: I0318 09:02:49.106121 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:49Z is after 2026-02-23T05:33:13Z Mar 18 09:02:49 crc kubenswrapper[4778]: E0318 09:02:49.740915 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:49Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:02:50 crc kubenswrapper[4778]: I0318 09:02:50.107088 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:50Z is after 2026-02-23T05:33:13Z Mar 18 09:02:50 crc kubenswrapper[4778]: E0318 09:02:50.124248 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:50Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 09:02:50 crc kubenswrapper[4778]: I0318 09:02:50.135500 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:50 crc kubenswrapper[4778]: I0318 09:02:50.136793 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:50 crc kubenswrapper[4778]: I0318 09:02:50.136845 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:50 crc kubenswrapper[4778]: I0318 09:02:50.136856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 
09:02:50 crc kubenswrapper[4778]: I0318 09:02:50.136876 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:50 crc kubenswrapper[4778]: E0318 09:02:50.142515 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:50Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.104323 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:51Z is after 2026-02-23T05:33:13Z Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.186310 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.188060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.188130 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.188144 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.189035 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263" Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.412953 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 
09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.099971 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.100239 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.101661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.101743 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.101763 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.103564 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:52Z is after 2026-02-23T05:33:13Z Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.421796 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.422752 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.425676 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc" exitCode=255 Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.425749 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc"} Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.425823 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263" Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.425988 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.427662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.427732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.427750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.428568 4778 scope.go:117] "RemoveContainer" containerID="bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc" Mar 18 09:02:52 crc kubenswrapper[4778]: E0318 09:02:52.428853 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.104968 4778 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:53Z is after 2026-02-23T05:33:13Z Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.426943 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.432138 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.435510 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.437104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.437151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.437165 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.437941 4778 scope.go:117] "RemoveContainer" containerID="bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc" Mar 18 09:02:53 crc kubenswrapper[4778]: E0318 09:02:53.438178 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:02:54 crc kubenswrapper[4778]: I0318 09:02:54.104861 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:54Z is after 2026-02-23T05:33:13Z Mar 18 09:02:54 crc kubenswrapper[4778]: E0318 09:02:54.266693 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:02:55 crc kubenswrapper[4778]: I0318 09:02:55.083810 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 09:02:55 crc kubenswrapper[4778]: E0318 09:02:55.088659 4778 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:55 crc kubenswrapper[4778]: E0318 09:02:55.089913 4778 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 18 09:02:55 crc kubenswrapper[4778]: I0318 09:02:55.100627 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled 
while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 09:02:55 crc kubenswrapper[4778]: I0318 09:02:55.100708 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 09:02:55 crc kubenswrapper[4778]: I0318 09:02:55.106942 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:55Z is after 2026-02-23T05:33:13Z Mar 18 09:02:56 crc kubenswrapper[4778]: I0318 09:02:56.107401 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:56Z is after 2026-02-23T05:33:13Z Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.105079 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:57Z is after 2026-02-23T05:33:13Z Mar 18 09:02:57 crc kubenswrapper[4778]: E0318 09:02:57.130783 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:57Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.142910 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.144450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.144599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.144721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.144862 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:57 crc kubenswrapper[4778]: E0318 09:02:57.151146 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:57Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.907968 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.908264 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.909994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.910054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.910076 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.911085 4778 scope.go:117] "RemoveContainer" containerID="bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc" Mar 18 09:02:57 crc kubenswrapper[4778]: E0318 09:02:57.911399 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:02:58 crc kubenswrapper[4778]: I0318 09:02:58.104586 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:58Z is after 2026-02-23T05:33:13Z Mar 18 09:02:59 crc kubenswrapper[4778]: I0318 09:02:59.107603 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:59Z is after 2026-02-23T05:33:13Z Mar 18 09:02:59 crc kubenswrapper[4778]: E0318 09:02:59.746826 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:59Z is after 
2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:02:59 crc kubenswrapper[4778]: W0318 09:02:59.911392 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:59Z is after 2026-02-23T05:33:13Z Mar 18 09:02:59 crc kubenswrapper[4778]: E0318 09:02:59.911510 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:03:00 crc kubenswrapper[4778]: I0318 09:03:00.106621 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:00Z is after 2026-02-23T05:33:13Z Mar 18 09:03:00 crc kubenswrapper[4778]: W0318 
09:03:00.564018 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:00Z is after 2026-02-23T05:33:13Z Mar 18 09:03:00 crc kubenswrapper[4778]: E0318 09:03:00.564131 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:03:01 crc kubenswrapper[4778]: W0318 09:03:01.100852 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:01Z is after 2026-02-23T05:33:13Z Mar 18 09:03:01 crc kubenswrapper[4778]: E0318 09:03:01.100945 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:03:01 crc kubenswrapper[4778]: I0318 09:03:01.105983 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:01Z is after 2026-02-23T05:33:13Z Mar 18 09:03:02 crc kubenswrapper[4778]: I0318 09:03:02.107317 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:02Z is after 2026-02-23T05:33:13Z Mar 18 09:03:03 crc kubenswrapper[4778]: I0318 09:03:03.106683 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:03Z is after 2026-02-23T05:33:13Z Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.107108 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:04Z is after 2026-02-23T05:33:13Z Mar 18 09:03:04 crc kubenswrapper[4778]: E0318 09:03:04.137460 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:04Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.151851 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.153985 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.154071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.154090 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.154131 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:03:04 crc kubenswrapper[4778]: E0318 09:03:04.158561 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:04Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 09:03:04 crc kubenswrapper[4778]: E0318 09:03:04.267843 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.459951 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.460233 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.462081 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.462147 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 
09:03:04.462169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:05 crc kubenswrapper[4778]: I0318 09:03:05.100925 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 09:03:05 crc kubenswrapper[4778]: I0318 09:03:05.101023 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 09:03:05 crc kubenswrapper[4778]: I0318 09:03:05.104804 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:05Z is after 2026-02-23T05:33:13Z Mar 18 09:03:06 crc kubenswrapper[4778]: I0318 09:03:06.106892 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:06Z is after 2026-02-23T05:33:13Z Mar 18 09:03:07 crc kubenswrapper[4778]: I0318 09:03:07.105750 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:07Z is after 2026-02-23T05:33:13Z Mar 18 09:03:07 crc kubenswrapper[4778]: W0318 09:03:07.397394 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:07Z is after 2026-02-23T05:33:13Z Mar 18 09:03:07 crc kubenswrapper[4778]: E0318 09:03:07.397504 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:03:08 crc kubenswrapper[4778]: I0318 09:03:08.106919 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:08Z is after 2026-02-23T05:33:13Z Mar 18 09:03:09 crc kubenswrapper[4778]: I0318 09:03:09.106638 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:09Z is after 2026-02-23T05:33:13Z Mar 18 09:03:09 crc kubenswrapper[4778]: E0318 09:03:09.754585 
4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:09Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:10 crc kubenswrapper[4778]: I0318 09:03:10.107245 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:10Z is after 2026-02-23T05:33:13Z Mar 18 09:03:11 crc kubenswrapper[4778]: I0318 09:03:11.105875 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:11Z is after 2026-02-23T05:33:13Z Mar 18 09:03:11 crc kubenswrapper[4778]: E0318 09:03:11.140641 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-18T09:03:11Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 09:03:11 crc kubenswrapper[4778]: I0318 09:03:11.159435 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:11 crc kubenswrapper[4778]: I0318 09:03:11.160753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:11 crc kubenswrapper[4778]: I0318 09:03:11.160810 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:11 crc kubenswrapper[4778]: I0318 09:03:11.160824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:11 crc kubenswrapper[4778]: I0318 09:03:11.160852 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:03:11 crc kubenswrapper[4778]: E0318 09:03:11.164125 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:11Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.107695 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.186489 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.187802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.187944 4778 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.187966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.188908 4778 scope.go:117] "RemoveContainer" containerID="bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.500544 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.503625 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e"} Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.503841 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.505326 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.505405 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.505434 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.106822 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 
18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.508340 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.509025 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.511530 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" exitCode=255 Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.511579 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e"} Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.511624 4778 scope.go:117] "RemoveContainer" containerID="bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.511818 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.513579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.513666 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.513697 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.514828 4778 scope.go:117] 
"RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:03:13 crc kubenswrapper[4778]: E0318 09:03:13.515306 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:03:14 crc kubenswrapper[4778]: I0318 09:03:14.107329 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:14 crc kubenswrapper[4778]: E0318 09:03:14.268518 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:03:14 crc kubenswrapper[4778]: I0318 09:03:14.517562 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.101523 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.101644 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" 
probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.101930 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.102169 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.103828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.103910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.103933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.104915 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.105099 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5" gracePeriod=30 Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.109669 4778 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.526428 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.528583 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.529308 4778 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5" exitCode=255 Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.529378 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5"} Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.529428 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89"} Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.529459 4778 scope.go:117] "RemoveContainer" containerID="c06152147e770c821660f00750e87810309b3aa8dbbd6e1ee031be349b3cf44e" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.529684 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:15 crc 
kubenswrapper[4778]: I0318 09:03:15.531150 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.531224 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.531238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.109448 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.533331 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.790930 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.791151 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.792870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.792936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.792962 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.107420 
4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.908315 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.908672 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.910485 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.910551 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.910577 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.911787 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:03:17 crc kubenswrapper[4778]: E0318 09:03:17.912153 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:03:18 crc kubenswrapper[4778]: I0318 09:03:18.107380 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" 
in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:18 crc kubenswrapper[4778]: E0318 09:03:18.150347 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 09:03:18 crc kubenswrapper[4778]: I0318 09:03:18.164374 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:18 crc kubenswrapper[4778]: I0318 09:03:18.165727 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:18 crc kubenswrapper[4778]: I0318 09:03:18.165758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:18 crc kubenswrapper[4778]: I0318 09:03:18.165768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:18 crc kubenswrapper[4778]: I0318 09:03:18.165788 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:03:18 crc kubenswrapper[4778]: E0318 09:03:18.170075 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 09:03:19 crc kubenswrapper[4778]: I0318 09:03:19.109236 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.764645 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.771555 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.776660 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.781692 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.786184 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de4040a7e11bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.260879807 +0000 UTC m=+0.835624657,LastTimestamp:2026-03-18 09:02:14.260879807 +0000 UTC m=+0.835624657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.792546 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.287951336 +0000 UTC m=+0.862696176,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.797019 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 
09:02:14.287973846 +0000 UTC m=+0.862718686,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.801527 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d5de5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.287983395 +0000 UTC m=+0.862728225,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.805515 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.289061094 +0000 UTC m=+0.863805934,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.812426 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.289077003 +0000 UTC m=+0.863821843,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.816529 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d5de5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.289085523 +0000 UTC m=+0.863830363,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.821595 4778 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.28983492 +0000 UTC m=+0.864579760,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.826416 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.2898511 +0000 UTC m=+0.864595940,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.833670 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d5de5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.289864659 +0000 UTC m=+0.864609499,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.840580 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.290140431 +0000 UTC m=+0.864885301,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.845222 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.29016374 +0000 UTC m=+0.864908610,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.851934 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d5de5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.29018156 +0000 UTC m=+0.864926430,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.858661 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.291121641 +0000 UTC m=+0.865866481,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.865889 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.291133341 +0000 UTC m=+0.865878181,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.871221 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d5de5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC 
m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.291141271 +0000 UTC m=+0.865886101,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.877270 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.293854019 +0000 UTC m=+0.868598879,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.883536 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.293887158 +0000 UTC m=+0.868632008,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.887650 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d5de5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.293908458 +0000 UTC m=+0.868653318,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.891823 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.294343255 +0000 UTC m=+0.869088095,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.896780 4778 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.294362894 +0000 UTC m=+0.869107734,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.902526 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de40424802223 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.697222691 +0000 UTC m=+1.271967541,LastTimestamp:2026-03-18 09:02:14.697222691 +0000 UTC m=+1.271967541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.907336 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189de4042492af88 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.698438536 +0000 UTC m=+1.273183386,LastTimestamp:2026-03-18 09:02:14.698438536 +0000 UTC m=+1.273183386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.911045 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40424d911bd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.703051197 +0000 UTC m=+1.277796047,LastTimestamp:2026-03-18 09:02:14.703051197 +0000 UTC m=+1.277796047,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.917380 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de404256e6fbc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.712840124 +0000 UTC m=+1.287584954,LastTimestamp:2026-03-18 09:02:14.712840124 +0000 UTC m=+1.287584954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.921982 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189de4042590f1d8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.715101656 +0000 UTC m=+1.289846506,LastTimestamp:2026-03-18 09:02:14.715101656 +0000 UTC m=+1.289846506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.926605 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40447d87c6b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.290215531 +0000 UTC m=+1.864960371,LastTimestamp:2026-03-18 09:02:15.290215531 +0000 UTC m=+1.864960371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 
09:03:19.931301 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de40447e253c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.290860481 +0000 UTC m=+1.865605361,LastTimestamp:2026-03-18 09:02:15.290860481 +0000 UTC m=+1.865605361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.935898 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40447e24e5d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.290859101 +0000 UTC m=+1.865603941,LastTimestamp:2026-03-18 09:02:15.290859101 +0000 UTC m=+1.865603941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.940609 4778 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189de40447e34ecd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.290924749 +0000 UTC m=+1.865669589,LastTimestamp:2026-03-18 09:02:15.290924749 +0000 UTC m=+1.865669589,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.944946 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de40447e4e5a0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.291028896 +0000 UTC m=+1.865773736,LastTimestamp:2026-03-18 09:02:15.291028896 +0000 UTC m=+1.865773736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.951539 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de404489d0584 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.303095684 +0000 UTC m=+1.877840534,LastTimestamp:2026-03-18 09:02:15.303095684 +0000 UTC m=+1.877840534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.957878 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40448d20cf9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.306571001 +0000 UTC m=+1.881315851,LastTimestamp:2026-03-18 09:02:15.306571001 +0000 UTC 
m=+1.881315851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.962477 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de40448d30b43 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.306636099 +0000 UTC m=+1.881380939,LastTimestamp:2026-03-18 09:02:15.306636099 +0000 UTC m=+1.881380939,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.967980 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189de40448d6a152 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.306871122 +0000 UTC 
m=+1.881615972,LastTimestamp:2026-03-18 09:02:15.306871122 +0000 UTC m=+1.881615972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.973659 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40448e2d658 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.307671128 +0000 UTC m=+1.882415978,LastTimestamp:2026-03-18 09:02:15.307671128 +0000 UTC m=+1.882415978,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.980598 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40448f7c693 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.309043347 +0000 UTC m=+1.883788197,LastTimestamp:2026-03-18 09:02:15.309043347 +0000 UTC m=+1.883788197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.987448 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4045c2c6264 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.631258212 +0000 UTC m=+2.206003122,LastTimestamp:2026-03-18 09:02:15.631258212 +0000 UTC m=+2.206003122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.993562 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4045ce870c1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.643582657 +0000 UTC m=+2.218327537,LastTimestamp:2026-03-18 09:02:15.643582657 +0000 UTC m=+2.218327537,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.000441 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4045cfd2478 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.644939384 +0000 UTC m=+2.219684254,LastTimestamp:2026-03-18 09:02:15.644939384 +0000 UTC m=+2.219684254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.006406 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4046a8ff938 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.872665912 +0000 UTC m=+2.447410762,LastTimestamp:2026-03-18 09:02:15.872665912 +0000 UTC m=+2.447410762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.011229 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4046b3f1f0b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.884144395 +0000 UTC m=+2.458889245,LastTimestamp:2026-03-18 09:02:15.884144395 +0000 UTC m=+2.458889245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.015973 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4046b5aaf90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.885950864 +0000 UTC m=+2.460695744,LastTimestamp:2026-03-18 09:02:15.885950864 +0000 UTC m=+2.460695744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.020827 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4047919a255 openshift-kube-controller-manager 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.116568661 +0000 UTC m=+2.691313581,LastTimestamp:2026-03-18 09:02:16.116568661 +0000 UTC m=+2.691313581,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.022595 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40479f098b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.130656432 +0000 UTC m=+2.705401272,LastTimestamp:2026-03-18 09:02:16.130656432 +0000 UTC m=+2.705401272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.027801 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4047e87721f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.207651359 +0000 UTC m=+2.782396199,LastTimestamp:2026-03-18 09:02:16.207651359 +0000 UTC m=+2.782396199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.033327 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4047e907b3c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.208243516 +0000 UTC m=+2.782988396,LastTimestamp:2026-03-18 09:02:16.208243516 +0000 UTC 
m=+2.782988396,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.038769 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189de4047ea52237 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.209596983 +0000 UTC m=+2.784341863,LastTimestamp:2026-03-18 09:02:16.209596983 +0000 UTC m=+2.784341863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.043364 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de4047ee73632 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.213927474 +0000 UTC m=+2.788672344,LastTimestamp:2026-03-18 09:02:16.213927474 +0000 UTC m=+2.788672344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.049433 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189de4048c3c5b4e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.437611342 +0000 UTC m=+3.012356182,LastTimestamp:2026-03-18 09:02:16.437611342 +0000 UTC m=+3.012356182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.054536 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4048c8347ca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.442259402 +0000 UTC m=+3.017004252,LastTimestamp:2026-03-18 09:02:16.442259402 +0000 UTC m=+3.017004252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.068942 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4048ce9a67e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.448968318 +0000 UTC m=+3.023713158,LastTimestamp:2026-03-18 09:02:16.448968318 +0000 UTC m=+3.023713158,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.076853 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de4048d962f7e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.460275582 +0000 UTC m=+3.035020422,LastTimestamp:2026-03-18 09:02:16.460275582 +0000 UTC m=+3.035020422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.083098 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189de4048dba4b6b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.462642027 +0000 UTC m=+3.037386867,LastTimestamp:2026-03-18 09:02:16.462642027 +0000 UTC m=+3.037386867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.087871 4778 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4048dc42298 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.463286936 +0000 UTC m=+3.038031776,LastTimestamp:2026-03-18 09:02:16.463286936 +0000 UTC m=+3.038031776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.091783 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4048de52811 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.465451025 +0000 UTC m=+3.040195865,LastTimestamp:2026-03-18 09:02:16.465451025 +0000 UTC 
m=+3.040195865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.095598 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de4048eef2c52 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.48288469 +0000 UTC m=+3.057629530,LastTimestamp:2026-03-18 09:02:16.48288469 +0000 UTC m=+3.057629530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.101125 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de4048efc041b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already 
present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.483726363 +0000 UTC m=+3.058471213,LastTimestamp:2026-03-18 09:02:16.483726363 +0000 UTC m=+3.058471213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: I0318 09:03:20.105483 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.105606 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4048f243812 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.486361106 +0000 UTC m=+3.061105946,LastTimestamp:2026-03-18 09:02:16.486361106 +0000 UTC m=+3.061105946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.109671 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4049952146c 
openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.657138796 +0000 UTC m=+3.231883636,LastTimestamp:2026-03-18 09:02:16.657138796 +0000 UTC m=+3.231883636,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.113244 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de40499e522bd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.666776253 +0000 UTC m=+3.241521093,LastTimestamp:2026-03-18 09:02:16.666776253 +0000 UTC m=+3.241521093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.117053 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4049a1a8e51 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.670277201 +0000 UTC m=+3.245022041,LastTimestamp:2026-03-18 09:02:16.670277201 +0000 UTC m=+3.245022041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.120552 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4049a27a027 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.671133735 +0000 UTC m=+3.245878575,LastTimestamp:2026-03-18 09:02:16.671133735 +0000 UTC m=+3.245878575,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" 
Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.125160 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de4049c0b6c89 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.702839945 +0000 UTC m=+3.277584785,LastTimestamp:2026-03-18 09:02:16.702839945 +0000 UTC m=+3.277584785,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.127100 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de4049c2105ca openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.704255434 +0000 UTC 
m=+3.279000274,LastTimestamp:2026-03-18 09:02:16.704255434 +0000 UTC m=+3.279000274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.132799 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de404a4fab8b4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.852740276 +0000 UTC m=+3.427485116,LastTimestamp:2026-03-18 09:02:16.852740276 +0000 UTC m=+3.427485116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.138607 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de404a5b9118e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started 
container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.865214862 +0000 UTC m=+3.439959702,LastTimestamp:2026-03-18 09:02:16.865214862 +0000 UTC m=+3.439959702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.144587 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404a72f62e9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.889746153 +0000 UTC m=+3.464490993,LastTimestamp:2026-03-18 09:02:16.889746153 +0000 UTC m=+3.464490993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.151227 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404a8534dee openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.908877294 +0000 UTC m=+3.483622134,LastTimestamp:2026-03-18 09:02:16.908877294 +0000 UTC m=+3.483622134,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.155676 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404a8682672 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.910243442 +0000 UTC m=+3.484988282,LastTimestamp:2026-03-18 09:02:16.910243442 +0000 UTC m=+3.484988282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.159676 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404b1a88a19 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.065458201 +0000 UTC m=+3.640203051,LastTimestamp:2026-03-18 09:02:17.065458201 +0000 UTC m=+3.640203051,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.164092 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404b294f114 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.08095106 +0000 UTC m=+3.655695890,LastTimestamp:2026-03-18 09:02:17.08095106 +0000 UTC m=+3.655695890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.167524 
4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404b2a4ef42 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.08199917 +0000 UTC m=+3.656744010,LastTimestamp:2026-03-18 09:02:17.08199917 +0000 UTC m=+3.656744010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.172113 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de404bc639bf8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.245490168 +0000 UTC 
m=+3.820235038,LastTimestamp:2026-03-18 09:02:17.245490168 +0000 UTC m=+3.820235038,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.176701 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404be39bb3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.276300093 +0000 UTC m=+3.851044923,LastTimestamp:2026-03-18 09:02:17.276300093 +0000 UTC m=+3.851044923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.182097 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404bec744be openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.28557587 +0000 UTC m=+3.860320710,LastTimestamp:2026-03-18 09:02:17.28557587 +0000 UTC m=+3.860320710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.187128 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de404c85827ea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.446066154 +0000 UTC m=+4.020810994,LastTimestamp:2026-03-18 09:02:17.446066154 +0000 UTC m=+4.020810994,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.192142 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de404c92d86b9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.460049593 +0000 UTC m=+4.034794433,LastTimestamp:2026-03-18 09:02:17.460049593 +0000 UTC m=+4.034794433,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.196869 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de404f805ad4a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.245967178 +0000 UTC m=+4.820712018,LastTimestamp:2026-03-18 09:02:18.245967178 +0000 UTC m=+4.820712018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.204860 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40504d2f3f4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.46074674 +0000 UTC m=+5.035491580,LastTimestamp:2026-03-18 09:02:18.46074674 +0000 UTC m=+5.035491580,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.211037 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40505837167 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.472313191 +0000 UTC m=+5.047058031,LastTimestamp:2026-03-18 09:02:18.472313191 +0000 UTC m=+5.047058031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.217488 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4050595766c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.473494124 +0000 UTC m=+5.048238974,LastTimestamp:2026-03-18 09:02:18.473494124 +0000 UTC m=+5.048238974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.222519 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40510cdd583 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.661737859 +0000 UTC m=+5.236482699,LastTimestamp:2026-03-18 09:02:18.661737859 +0000 UTC m=+5.236482699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.227609 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de405119c5cfb openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.675272955 +0000 UTC m=+5.250017795,LastTimestamp:2026-03-18 09:02:18.675272955 +0000 UTC m=+5.250017795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.230400 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40511af672e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.67652075 +0000 UTC m=+5.251265590,LastTimestamp:2026-03-18 09:02:18.67652075 +0000 UTC m=+5.251265590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.234508 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189de4051e23e41b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.885481499 +0000 UTC m=+5.460226329,LastTimestamp:2026-03-18 09:02:18.885481499 +0000 UTC m=+5.460226329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.241562 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4051f17d6de openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.901468894 +0000 UTC m=+5.476213764,LastTimestamp:2026-03-18 09:02:18.901468894 +0000 UTC m=+5.476213764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.248520 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4051f306edd openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.903080669 +0000 UTC m=+5.477825509,LastTimestamp:2026-03-18 09:02:18.903080669 +0000 UTC m=+5.477825509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.255725 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4052db6764e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:19.146745422 +0000 UTC m=+5.721490272,LastTimestamp:2026-03-18 09:02:19.146745422 +0000 UTC m=+5.721490272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.262352 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189de4052eb43dbb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:19.163377083 +0000 UTC m=+5.738121933,LastTimestamp:2026-03-18 09:02:19.163377083 +0000 UTC m=+5.738121933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.268934 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4052ec8305c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:19.16468438 +0000 UTC m=+5.739429230,LastTimestamp:2026-03-18 09:02:19.16468438 +0000 UTC m=+5.739429230,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.276665 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4053ba7950b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:19.380651275 +0000 UTC m=+5.955396105,LastTimestamp:2026-03-18 09:02:19.380651275 +0000 UTC m=+5.955396105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.280646 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4053cd659b7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:19.400493495 +0000 UTC m=+5.975238335,LastTimestamp:2026-03-18 09:02:19.400493495 +0000 UTC m=+5.975238335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.290718 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 09:03:20 crc 
kubenswrapper[4778]: &Event{ObjectMeta:{kube-controller-manager-crc.189de40690a67ac0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 09:03:20 crc kubenswrapper[4778]: body: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:25.101609664 +0000 UTC m=+11.676354544,LastTimestamp:2026-03-18 09:02:25.101609664 +0000 UTC m=+11.676354544,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.295340 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40690a7fba5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:25.101708197 +0000 UTC 
m=+11.676453077,LastTimestamp:2026-03-18 09:02:25.101708197 +0000 UTC m=+11.676453077,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.299158 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 09:03:20 crc kubenswrapper[4778]: &Event{ObjectMeta:{kube-apiserver-crc.189de4074509819d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:38612->192.168.126.11:17697: read: connection reset by peer Mar 18 09:03:20 crc kubenswrapper[4778]: body: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:28.127998365 +0000 UTC m=+14.702743255,LastTimestamp:2026-03-18 09:02:28.127998365 +0000 UTC m=+14.702743255,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.305155 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de407450ab9f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38612->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:28.128078327 +0000 UTC m=+14.702823207,LastTimestamp:2026-03-18 09:02:28.128078327 +0000 UTC m=+14.702823207,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.313020 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189de404b2a4ef42\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404b2a4ef42 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.08199917 +0000 UTC m=+3.656744010,LastTimestamp:2026-03-18 09:02:28.310736445 +0000 UTC m=+14.885481325,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: 
E0318 09:03:20.320453 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189de404be39bb3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404be39bb3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.276300093 +0000 UTC m=+3.851044923,LastTimestamp:2026-03-18 09:02:28.57501725 +0000 UTC m=+15.149762090,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.326864 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189de404bec744be\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404bec744be openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.28557587 +0000 UTC m=+3.860320710,LastTimestamp:2026-03-18 09:02:28.584920166 +0000 UTC 
m=+15.159665016,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.334245 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 09:03:20 crc kubenswrapper[4778]: &Event{ObjectMeta:{kube-apiserver-crc.189de407a443dec6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 09:03:20 crc kubenswrapper[4778]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 09:03:20 crc kubenswrapper[4778]: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:29.725658822 +0000 UTC m=+16.300403692,LastTimestamp:2026-03-18 09:02:29.725658822 +0000 UTC m=+16.300403692,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.338879 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de407a446efae openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:29.725859758 +0000 UTC m=+16.300604628,LastTimestamp:2026-03-18 09:02:29.725859758 +0000 UTC m=+16.300604628,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.346276 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 09:03:20 crc kubenswrapper[4778]: &Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b15262 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 09:03:20 crc kubenswrapper[4778]: body: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101540962 +0000 UTC m=+21.676285832,LastTimestamp:2026-03-18 09:02:35.101540962 +0000 UTC m=+21.676285832,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.353393 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b24d22 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101605154 +0000 UTC m=+21.676350024,LastTimestamp:2026-03-18 09:02:35.101605154 +0000 UTC m=+21.676350024,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.363600 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de408e4b15262\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 09:03:20 crc kubenswrapper[4778]: &Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b15262 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 09:03:20 crc kubenswrapper[4778]: body: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101540962 +0000 UTC m=+21.676285832,LastTimestamp:2026-03-18 09:02:45.100912731 +0000 UTC m=+31.675657601,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.373605 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de408e4b24d22\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b24d22 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101605154 +0000 UTC m=+21.676350024,LastTimestamp:2026-03-18 09:02:45.100975453 
+0000 UTC m=+31.675720333,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.381548 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40b38e3b9a8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:45.104064936 +0000 UTC m=+31.678809836,LastTimestamp:2026-03-18 09:02:45.104064936 +0000 UTC m=+31.678809836,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.389652 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de40448e2d658\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40448e2d658 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.307671128 +0000 UTC m=+1.882415978,LastTimestamp:2026-03-18 09:02:45.225294967 +0000 UTC m=+31.800039837,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.397136 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de4045c2c6264\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4045c2c6264 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.631258212 +0000 UTC m=+2.206003122,LastTimestamp:2026-03-18 09:02:45.463167835 +0000 UTC m=+32.037912685,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.404193 4778 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189de4045ce870c1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4045ce870c1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.643582657 +0000 UTC m=+2.218327537,LastTimestamp:2026-03-18 09:02:45.472340251 +0000 UTC m=+32.047085101,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.414361 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de408e4b15262\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 09:03:20 crc kubenswrapper[4778]: &Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b15262 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
18 09:03:20 crc kubenswrapper[4778]: body: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101540962 +0000 UTC m=+21.676285832,LastTimestamp:2026-03-18 09:02:55.100684971 +0000 UTC m=+41.675429831,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.421089 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de408e4b24d22\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b24d22 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101605154 +0000 UTC m=+21.676350024,LastTimestamp:2026-03-18 09:02:55.100766593 +0000 UTC m=+41.675511473,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.430446 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de408e4b15262\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 09:03:20 crc kubenswrapper[4778]: &Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b15262 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 09:03:20 crc kubenswrapper[4778]: body: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101540962 +0000 UTC m=+21.676285832,LastTimestamp:2026-03-18 09:03:05.100995927 +0000 UTC m=+51.675740777,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:21 crc kubenswrapper[4778]: I0318 09:03:21.108458 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.101083 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.101389 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.107219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.107250 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.107267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.111937 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.116015 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.551344 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.552587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.552636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.552652 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.107278 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.426424 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.426741 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.429529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.429574 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.429591 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.430355 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:03:23 crc kubenswrapper[4778]: E0318 09:03:23.430653 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:03:24 crc kubenswrapper[4778]: I0318 09:03:24.106341 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:24 crc kubenswrapper[4778]: E0318 09:03:24.269276 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:03:25 crc kubenswrapper[4778]: I0318 09:03:25.107136 4778 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:25 crc kubenswrapper[4778]: E0318 09:03:25.157160 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 09:03:25 crc kubenswrapper[4778]: I0318 09:03:25.170777 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:25 crc kubenswrapper[4778]: I0318 09:03:25.172113 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:25 crc kubenswrapper[4778]: I0318 09:03:25.172157 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:25 crc kubenswrapper[4778]: I0318 09:03:25.172172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:25 crc kubenswrapper[4778]: I0318 09:03:25.172214 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:03:25 crc kubenswrapper[4778]: E0318 09:03:25.176512 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.107955 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.186867 4778 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.188567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.188698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.188775 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.794344 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.794886 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.796259 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.796290 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.796299 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:27 crc kubenswrapper[4778]: I0318 09:03:27.092223 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 09:03:27 crc kubenswrapper[4778]: I0318 09:03:27.106398 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:27 crc 
kubenswrapper[4778]: I0318 09:03:27.109742 4778 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 09:03:27 crc kubenswrapper[4778]: W0318 09:03:27.416140 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 09:03:27 crc kubenswrapper[4778]: E0318 09:03:27.416524 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 09:03:28 crc kubenswrapper[4778]: I0318 09:03:28.107958 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:29 crc kubenswrapper[4778]: I0318 09:03:29.058928 4778 csr.go:261] certificate signing request csr-cbhwp is approved, waiting to be issued Mar 18 09:03:29 crc kubenswrapper[4778]: I0318 09:03:29.067231 4778 csr.go:257] certificate signing request csr-cbhwp is issued Mar 18 09:03:29 crc kubenswrapper[4778]: I0318 09:03:29.120784 4778 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 18 09:03:29 crc kubenswrapper[4778]: I0318 09:03:29.707686 4778 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 09:03:29 crc kubenswrapper[4778]: I0318 09:03:29.939011 4778 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 09:03:29 crc kubenswrapper[4778]: W0318 09:03:29.939282 4778 reflector.go:484] k8s.io/client-go/informers/factory.go:160: 
watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.068851 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-18 13:20:11.739375083 +0000 UTC Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.068902 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5884h16m41.670477987s for next certificate rotation Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.118237 4778 apiserver.go:52] "Watching apiserver" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.127624 4778 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.127907 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.128481 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.128542 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.128562 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.128665 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.128797 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.128825 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.129098 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.129424 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.129480 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.131017 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.131081 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.131615 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.131767 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.132607 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.132697 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.133765 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.134117 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.134145 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.166931 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.182096 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.196566 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.204146 4778 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.213380 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228517 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228747 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228817 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228858 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228903 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228940 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228980 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229014 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229048 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229129 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229168 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229237 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229286 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229293 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229323 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229359 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229399 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229432 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229467 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229506 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229545 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229584 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229619 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229656 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229690 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229724 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229758 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229791 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229897 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229910 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229938 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229948 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230019 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230078 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230138 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" 
(UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230155 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230190 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230288 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230298 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230339 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230344 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230443 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230484 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230519 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230525 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230584 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230613 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230639 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230661 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230687 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230731 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230753 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230776 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230799 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230821 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230862 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230873 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230886 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230950 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230989 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231029 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231067 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231090 4778 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231102 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231138 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231122 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231175 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231246 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231285 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231319 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231353 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231392 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231433 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231469 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231504 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231538 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231541 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231594 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231610 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231618 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231667 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231693 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231716 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231742 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231735 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231771 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231796 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231817 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231829 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231864 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231864 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231889 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231913 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231937 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231961 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231943 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231985 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232013 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232034 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232038 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232080 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232105 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232129 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232149 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232171 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232218 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232244 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232268 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232292 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232315 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232338 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232360 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232386 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232406 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232452 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232484 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232507 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232529 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232550 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232570 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232591 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232611 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232639 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232660 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232681 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232702 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232711 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232724 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232779 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232808 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232837 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232862 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232890 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232918 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232945 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232972 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232983 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod 
"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232997 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232999 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233024 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233052 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233107 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233131 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233152 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233174 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233219 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233243 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233268 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233291 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233315 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233336 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233359 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233382 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233404 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233428 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233451 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233474 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233495 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233518 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233541 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233563 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233587 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233610 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233632 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233654 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233677 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233701 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233723 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233743 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233768 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233792 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233817 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233863 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233885 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233881 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233913 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233899 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233936 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233980 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234028 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234098 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234235 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc 
kubenswrapper[4778]: I0318 09:03:30.234302 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234360 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234417 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234471 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234532 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234576 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234613 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234628 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234634 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234659 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234709 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234715 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234760 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234763 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234822 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234851 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234846 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234915 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235256 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234878 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235391 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235467 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235527 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235710 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235775 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235831 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235887 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235938 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235992 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236049 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236106 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236164 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236328 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236383 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236449 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236505 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236557 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236617 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236719 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236783 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236831 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236889 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236978 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237036 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237127 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237243 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237310 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237372 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: 
I0318 09:03:30.237418 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237460 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237502 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237550 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237643 4778 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237672 4778 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237697 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237721 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237743 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237766 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237790 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237812 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237833 4778 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237854 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237874 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237896 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237918 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237939 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237960 4778 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237981 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: 
I0318 09:03:30.238002 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238026 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238052 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238087 4778 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238109 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238136 4778 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238169 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238239 4778 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238266 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238288 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238310 4778 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238334 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238356 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238385 4778 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238407 4778 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" 
(UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238433 4778 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235375 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235466 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235586 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235954 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235982 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236097 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236135 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237295 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237480 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238100 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238253 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238635 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.239001 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.239676 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.240398 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.240440 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.240561 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.241055 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.241242 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.241271 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.241318 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.241339 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.241346 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.241522 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:03:30.741477675 +0000 UTC m=+77.316222565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.243715 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.244165 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.244520 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.244568 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.244731 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.244970 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.245456 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.245707 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.245799 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.245819 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.245904 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.245550 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246070 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246096 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246432 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246488 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246515 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246566 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246588 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246925 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246981 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.247042 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.247012 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.247168 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.247481 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.247506 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.247647 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.248453 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.249777 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.249792 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250137 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250401 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250458 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250517 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250721 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250779 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250900 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.251082 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.251260 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.251283 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.251433 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.251592 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.251916 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.252806 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.252845 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.252991 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.253365 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.253400 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.253417 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.253842 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.254295 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.254514 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.254743 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.254855 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.255317 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.256499 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.256688 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.256793 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.257391 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.257457 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.257824 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.258305 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.258094 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.258648 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.258905 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.259009 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 09:03:30.75896998 +0000 UTC m=+77.333714830 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.259044 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.259566 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.260052 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.260065 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.259651 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.259710 4778 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.260836 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.261341 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.261436 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.262444 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.263245 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.263646 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.263917 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.264136 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.264163 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.264589 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.264750 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.265575 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.267119 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.267378 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.267653 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.267976 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.268005 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.268087 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:30.768070401 +0000 UTC m=+77.342815251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.268128 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.268419 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.268612 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.268863 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.269711 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.269920 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.270113 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.270151 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.270803 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.271507 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.273457 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.273505 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.273532 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.273659 4778 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:30.773627819 +0000 UTC m=+77.348372699 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.277217 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.280037 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.282399 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.282439 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.282462 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.282545 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:30.782519956 +0000 UTC m=+77.357264806 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.283056 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.283260 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.283758 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.285066 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.285536 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.286017 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.286089 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.286040 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.286477 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.286382 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.286457 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.287096 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.287145 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.287170 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.287446 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.287236 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.287872 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.288059 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.288566 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.289370 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.289846 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.290549 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.290790 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.290943 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.291412 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.291625 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.291950 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.292087 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.292130 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.292609 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.292707 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.293494 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.293496 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.293531 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.293843 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.293908 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.293969 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.294008 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.294163 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.294463 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.294850 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.294967 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.295445 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.295457 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.295637 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.297629 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.298209 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.307614 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.312572 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.322054 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.327371 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.329848 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.332049 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339319 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339383 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339475 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339493 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339507 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339523 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339536 4778 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339550 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339563 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339576 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339592 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339608 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339624 4778 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339637 4778 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339649 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339662 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339675 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339687 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339591 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339700 4778 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339754 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339761 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339817 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339831 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339845 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339862 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339880 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339897 4778 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339911 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339923 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339937 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339950 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339964 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339979 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339992 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340005 4778 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340018 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340030 4778 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340043 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340055 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340070 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340084 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" 
DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340100 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340114 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340128 4778 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340141 4778 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340154 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340166 4778 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340179 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc 
kubenswrapper[4778]: I0318 09:03:30.340192 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340227 4778 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340240 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340253 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340265 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340278 4778 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340290 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340302 4778 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340316 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340329 4778 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340341 4778 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340355 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340368 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340379 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340393 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340409 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340422 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340434 4778 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340447 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340459 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340470 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340482 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" 
DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340494 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340505 4778 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340517 4778 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340529 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340541 4778 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340553 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340566 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 
09:03:30.340578 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340591 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340606 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340620 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340632 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340644 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340656 4778 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340669 4778 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340681 4778 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340693 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340707 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340722 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340736 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340751 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340764 4778 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340777 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340791 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340803 4778 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340815 4778 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340828 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340839 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340852 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340868 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340886 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340900 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340912 4778 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340924 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340937 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340949 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340961 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340973 4778 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340986 4778 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340999 4778 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341011 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341022 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341034 4778 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341045 4778 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341057 4778 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341069 4778 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341080 4778 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341091 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341103 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341147 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341160 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341172 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341184 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341214 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341227 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341242 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341254 4778 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341267 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on 
node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341278 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341290 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341302 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341315 4778 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341328 4778 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341340 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341354 4778 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341366 4778 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341379 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341392 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341404 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341417 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341431 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341444 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341456 4778 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341469 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341481 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341494 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341506 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341518 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341531 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341543 4778 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341555 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341567 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341616 4778 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341630 4778 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341644 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341658 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341671 4778 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341684 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341697 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341710 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341722 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341734 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341749 4778 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341761 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" 
DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341772 4778 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.451103 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.469751 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.478016 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:30 crc kubenswrapper[4778]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 09:03:30 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 09:03:30 crc kubenswrapper[4778]: source /etc/kubernetes/apiserver-url.env Mar 18 09:03:30 crc kubenswrapper[4778]: else Mar 18 09:03:30 crc kubenswrapper[4778]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 09:03:30 crc kubenswrapper[4778]: exit 1 Mar 18 09:03:30 crc kubenswrapper[4778]: fi Mar 18 09:03:30 crc kubenswrapper[4778]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 09:03:30 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:30 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.479268 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.483456 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: W0318 09:03:30.485569 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-0a13c0f6f9114eaf03952727eeb70b44491688bc42826224aea18889d06b7473 WatchSource:0}: Error finding container 0a13c0f6f9114eaf03952727eeb70b44491688bc42826224aea18889d06b7473: Status 404 returned error can't find the container with id 0a13c0f6f9114eaf03952727eeb70b44491688bc42826224aea18889d06b7473 Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.491488 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:30 crc kubenswrapper[4778]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:30 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:30 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:30 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: fi Mar 18 09:03:30 crc kubenswrapper[4778]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 09:03:30 crc kubenswrapper[4778]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 09:03:30 crc kubenswrapper[4778]: ho_enable="--enable-hybrid-overlay" Mar 18 09:03:30 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 09:03:30 crc kubenswrapper[4778]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 09:03:30 crc kubenswrapper[4778]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 09:03:30 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:30 crc kubenswrapper[4778]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --webhook-host=127.0.0.1 \ Mar 18 09:03:30 crc kubenswrapper[4778]: --webhook-port=9743 \ Mar 18 09:03:30 crc kubenswrapper[4778]: ${ho_enable} \ Mar 18 09:03:30 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:03:30 crc kubenswrapper[4778]: --disable-approver \ Mar 18 09:03:30 crc kubenswrapper[4778]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --wait-for-kubernetes-api=200s \ Mar 18 09:03:30 crc kubenswrapper[4778]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:30 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:30 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.495050 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:30 crc kubenswrapper[4778]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:30 crc 
kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:30 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:30 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: fi Mar 18 09:03:30 crc kubenswrapper[4778]: Mar 18 09:03:30 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 09:03:30 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:30 crc kubenswrapper[4778]: --disable-webhook \ Mar 18 09:03:30 crc kubenswrapper[4778]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:30 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:30 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: W0318 09:03:30.495661 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-b77fa638174657854f588becf949f187b6c3bc1b97dc647bce430433e94d60ff WatchSource:0}: Error finding container b77fa638174657854f588becf949f187b6c3bc1b97dc647bce430433e94d60ff: Status 404 returned error can't find the container with id 
b77fa638174657854f588becf949f187b6c3bc1b97dc647bce430433e94d60ff Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.496594 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.499392 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.500585 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.577022 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0a13c0f6f9114eaf03952727eeb70b44491688bc42826224aea18889d06b7473"} Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.579126 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"31e3fc40d1b9e93db517675fadb95d616ebb6f222bec7672ed9fd7398ad8be72"} Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.580169 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:30 crc kubenswrapper[4778]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:30 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:30 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:30 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: fi Mar 18 09:03:30 crc kubenswrapper[4778]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 09:03:30 crc kubenswrapper[4778]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 09:03:30 crc kubenswrapper[4778]: ho_enable="--enable-hybrid-overlay" Mar 18 09:03:30 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 09:03:30 crc kubenswrapper[4778]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 09:03:30 crc kubenswrapper[4778]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 09:03:30 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:30 crc kubenswrapper[4778]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --webhook-host=127.0.0.1 \ Mar 18 09:03:30 crc kubenswrapper[4778]: --webhook-port=9743 \ Mar 18 09:03:30 crc kubenswrapper[4778]: ${ho_enable} \ Mar 18 09:03:30 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:03:30 crc kubenswrapper[4778]: --disable-approver \ Mar 18 09:03:30 crc kubenswrapper[4778]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --wait-for-kubernetes-api=200s \ Mar 18 09:03:30 crc kubenswrapper[4778]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:30 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:30 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.581770 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b77fa638174657854f588becf949f187b6c3bc1b97dc647bce430433e94d60ff"} Mar 18 09:03:30 crc kubenswrapper[4778]: 
E0318 09:03:30.582077 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:30 crc kubenswrapper[4778]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 09:03:30 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 09:03:30 crc kubenswrapper[4778]: source /etc/kubernetes/apiserver-url.env Mar 18 09:03:30 crc kubenswrapper[4778]: else Mar 18 09:03:30 crc kubenswrapper[4778]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 09:03:30 crc kubenswrapper[4778]: exit 1 Mar 18 09:03:30 crc kubenswrapper[4778]: fi Mar 18 09:03:30 crc kubenswrapper[4778]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 09:03:30 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom
:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:30 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.583342 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.583794 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:30 crc kubenswrapper[4778]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:30 crc kubenswrapper[4778]: if [[ -f "/env/_master" 
]]; then Mar 18 09:03:30 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:30 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: fi Mar 18 09:03:30 crc kubenswrapper[4778]: Mar 18 09:03:30 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 09:03:30 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:30 crc kubenswrapper[4778]: --disable-webhook \ Mar 18 09:03:30 crc kubenswrapper[4778]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:30 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:30 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.584931 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.584981 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.586785 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.597599 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.614994 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.629720 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.646804 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.658461 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.668424 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.679957 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.691026 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.702642 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.714267 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.725906 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.738331 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.745524 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.745787 4778 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:03:31.745744932 +0000 UTC m=+78.320489782 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.846764 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.846815 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.846836 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.846855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847023 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847073 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847095 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847119 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847141 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:31.847114487 +0000 UTC m=+78.421859337 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847159 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847184 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847130 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847273 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:31.847253381 +0000 UTC m=+78.421998261 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847304 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847305 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:31.847290142 +0000 UTC m=+78.422035022 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847353 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:31.847334933 +0000 UTC m=+78.422079773 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: I0318 09:03:31.758770 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.758969 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:03:33.758928231 +0000 UTC m=+80.333673101 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:03:31 crc kubenswrapper[4778]: I0318 09:03:31.859646 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:31 crc kubenswrapper[4778]: I0318 09:03:31.859744 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:31 crc kubenswrapper[4778]: I0318 09:03:31.859815 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.859863 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:31 crc 
kubenswrapper[4778]: E0318 09:03:31.859982 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:33.859947606 +0000 UTC m=+80.434692486 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860000 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860045 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860074 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:33.860050979 +0000 UTC m=+80.434795859 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860090 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860115 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860185 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:33.860161442 +0000 UTC m=+80.434906322 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860235 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860280 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860314 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:31 crc kubenswrapper[4778]: I0318 09:03:31.859868 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860373 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2026-03-18 09:03:33.860352977 +0000 UTC m=+80.435097857 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.176617 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.178812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.178893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.178919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.179024 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.186904 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.186936 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.187070 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.187139 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.187383 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.187509 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.191990 4778 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.192183 4778 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.193971 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.194020 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.194037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.194063 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.194084 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.197822 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.199104 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.201468 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.203625 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.206312 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.208185 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.209747 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.212130 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.213912 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.216271 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.217655 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.220301 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.221713 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.223138 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.224232 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.225293 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.226559 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.229000 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.230080 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.231637 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.233937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.234037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.234133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.234191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.234278 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.235180 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.236446 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.237988 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.239842 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.241683 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.243900 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.245434 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.247930 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.248998 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.251364 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.252395 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.253839 4778 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.254122 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 09:03:32 
crc kubenswrapper[4778]: E0318 09:03:32.256665 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.257728 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.258968 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.260099 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.263064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.263142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.263170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.263239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.263267 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.265720 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.269284 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.270805 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.272624 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.275164 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.276442 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.278144 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.278671 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.280240 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.281541 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.283918 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.283965 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.283980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.284001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.284016 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.284101 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.285622 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.287748 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.288668 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.289723 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.290350 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.291330 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.291907 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.292532 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.293714 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.303000 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.309295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.309343 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.309361 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.309387 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.309405 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.326266 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.326642 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.328533 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.328588 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.328601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.328624 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.328638 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.431944 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.432082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.432102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.432129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.432176 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.534665 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.534720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.534744 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.534768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.534787 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.639732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.639803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.639822 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.639853 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.639873 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.743127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.743247 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.743270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.743486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.743515 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.847378 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.847440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.847450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.847469 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.847482 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.952026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.952101 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.952129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.952164 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.952191 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.055401 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.055469 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.055487 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.055510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.055529 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.159530 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.159600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.159620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.159646 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.159665 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.263865 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.263934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.263953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.263985 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.264004 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.368989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.369071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.369096 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.369133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.369158 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.473840 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.473931 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.473951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.473980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.474002 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.577723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.577794 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.577813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.577843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.577863 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.681442 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.681524 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.681542 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.681569 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.681587 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.779904 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.780122 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 09:03:37.780092048 +0000 UTC m=+84.354836898 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.785409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.785486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.785504 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.785529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.785547 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.881030 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.881098 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.881125 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.881150 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881161 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881287 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:37.881257648 +0000 UTC m=+84.456002508 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881316 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881362 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881395 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881417 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881372 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:37.8813558 +0000 UTC m=+84.456100660 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881452 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881507 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:37.881474683 +0000 UTC m=+84.456219533 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881529 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881557 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881653 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:37.881625307 +0000 UTC m=+84.456370177 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.888375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.888413 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.888426 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.888443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.888456 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.991169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.991263 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.991280 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.991302 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.991319 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.094075 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.094430 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.094757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.094864 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.094961 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.187102 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.187251 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:34 crc kubenswrapper[4778]: E0318 09:03:34.187436 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.187572 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:34 crc kubenswrapper[4778]: E0318 09:03:34.187868 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:34 crc kubenswrapper[4778]: E0318 09:03:34.188082 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.197589 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.197894 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.198016 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.198089 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.198150 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.204908 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.221399 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.237895 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.254302 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.269397 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.294530 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.300848 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.300910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.300928 4778 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.300951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.300968 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.403762 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.403812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.403828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.403855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.403875 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.506483 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.506550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.506573 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.506600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.506621 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.609129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.609193 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.609257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.609285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.609307 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.711637 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.711690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.711708 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.711730 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.711747 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.814454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.814509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.814532 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.814560 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.814583 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.918114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.918189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.918281 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.918353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.918384 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.021324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.021455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.021480 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.021510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.021532 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.124696 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.124750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.124762 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.124783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.124796 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.208967 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.227169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.227271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.227290 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.227314 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.227335 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.330917 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.330971 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.330989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.331011 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.331033 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.434866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.434955 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.434980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.435014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.435038 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.538299 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.538429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.538454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.538489 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.538516 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.640695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.640741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.640753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.640770 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.640783 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.743129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.743238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.743251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.743270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.743282 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.846474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.846543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.846570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.846601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.846623 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.950251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.950328 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.950351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.950383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.950406 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.054511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.055015 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.055238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.055474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.055713 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.158828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.158873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.158882 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.158925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.158939 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.186507 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.186597 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:36 crc kubenswrapper[4778]: E0318 09:03:36.186684 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.186825 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:36 crc kubenswrapper[4778]: E0318 09:03:36.186984 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:36 crc kubenswrapper[4778]: E0318 09:03:36.187290 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.261655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.261709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.261724 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.261742 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.261755 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.364712 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.364782 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.364796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.364837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.364849 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.467344 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.467396 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.467404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.467418 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.467428 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.570080 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.570186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.570237 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.570264 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.570283 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.673269 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.673333 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.673353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.673380 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.673397 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.776339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.776398 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.776416 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.776440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.776457 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.881414 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.881723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.881817 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.881916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.882003 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.915327 4778 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.984709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.984741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.984753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.984768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.984779 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.087633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.087701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.087728 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.087775 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.087790 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.190775 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.190846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.190867 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.190899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.190918 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.201653 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.202666 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.203017 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.294900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.294989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.295011 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.295042 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.295067 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.397661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.397732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.397754 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.397783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.397806 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.500239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.500311 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.500329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.500355 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.500372 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.603372 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.603450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.603477 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.603514 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.603539 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.604073 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.604475 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.706770 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.706850 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.706871 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.706902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.706924 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.809911 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.810357 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.810531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.810687 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.810831 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.815599 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.815900 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 09:03:45.81586917 +0000 UTC m=+92.390614040 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.914774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.914880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.914898 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.914922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.914940 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.916361 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.916632 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.916542 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.916877 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.916986 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:45.916954438 +0000 UTC m=+92.491699318 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.916859 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917136 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917188 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917322 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917379 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:45.917350868 +0000 UTC m=+92.492095738 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917467 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:45.917453421 +0000 UTC m=+92.492198291 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.917118 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917579 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917601 4778 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917618 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917690 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:45.917667817 +0000 UTC m=+92.492412857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.018574 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.018679 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.018702 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.018725 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc 
kubenswrapper[4778]: I0318 09:03:38.018742 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.121754 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.121805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.121817 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.121835 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.121847 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.187041 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:38 crc kubenswrapper[4778]: E0318 09:03:38.187259 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.187502 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.187577 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:38 crc kubenswrapper[4778]: E0318 09:03:38.187711 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:38 crc kubenswrapper[4778]: E0318 09:03:38.187893 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.224897 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.224959 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.224983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.225008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.225027 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.328367 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.328689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.328878 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.329046 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.329222 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.432160 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.432304 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.432323 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.432346 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.432366 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.535662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.535720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.535740 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.535763 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.535778 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.639174 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.639296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.639324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.639350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.639371 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.742876 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.742954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.742976 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.743006 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.743029 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.845448 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.845491 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.845500 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.845514 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.845527 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.948613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.948695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.948723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.948757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.948780 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.051604 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.051687 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.051705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.051738 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.051761 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.155621 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.155681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.155692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.155712 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.155725 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.258905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.258968 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.258978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.258995 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.259008 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.361643 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.361717 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.361734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.361759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.361779 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.464250 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.464308 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.464326 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.464352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.464370 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.567597 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.567644 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.567654 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.567668 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.567678 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.669849 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.670118 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.670249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.670352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.670436 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.774026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.774089 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.774107 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.774157 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.774174 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.877314 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.877607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.877683 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.877750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.877819 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.980698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.981166 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.981410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.981617 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.981825 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.085467 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.086286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.086316 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.086340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.086357 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.186497 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:40 crc kubenswrapper[4778]: E0318 09:03:40.186738 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.186505 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.186858 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:40 crc kubenswrapper[4778]: E0318 09:03:40.186909 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:40 crc kubenswrapper[4778]: E0318 09:03:40.188095 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.189142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.189270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.189296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.189375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.189451 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.292291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.292347 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.292364 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.292387 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.292404 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.395355 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.395409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.395429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.395453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.395472 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.497976 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.498025 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.498036 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.498058 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.498071 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.601559 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.601619 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.601635 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.601662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.601677 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.704826 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.704885 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.704897 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.704919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.704934 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.808155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.808277 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.808296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.808324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.808343 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.911409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.911471 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.911483 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.911502 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.911515 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.014640 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.014689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.014699 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.014718 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.014728 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.122829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.122905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.122923 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.122951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.122968 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.226244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.226312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.226325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.226350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.226364 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.329872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.329955 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.329973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.330000 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.330033 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.432988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.433054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.433071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.433096 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.433118 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.535825 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.535880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.535896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.535920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.535937 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.639173 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.639283 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.639302 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.639327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.639347 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.742342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.742410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.742428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.742454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.742473 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.845121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.845183 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.845227 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.845252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.845269 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.948098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.948145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.948162 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.948185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.948236 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.051270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.051343 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.051365 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.051390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.051409 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.154327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.154371 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.154379 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.154393 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.154404 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.186566 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.186645 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.186840 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.186860 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.187012 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.187178 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.256927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.257020 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.257045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.257071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.257091 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.360701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.360796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.360816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.360843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.360862 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.362678 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.362739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.362759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.362787 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.362805 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.381125 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.386560 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.386617 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.386635 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.386661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.386679 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.403359 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.407866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.407899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.407907 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.407921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.407931 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.422731 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.427796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.427862 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.427881 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.427912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.427931 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.444103 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.448937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.449082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.449167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.449297 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.449403 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.461166 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.461906 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.464114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.464179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.464229 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.464257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.464275 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.567044 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.567075 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.567082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.567095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.567106 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.674101 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.674144 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.674154 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.674169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.674185 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.777701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.777779 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.777804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.777837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.777863 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.880704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.880760 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.880816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.880839 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.880855 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.983534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.983575 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.983587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.983605 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.983617 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.086972 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.087047 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.087066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.087092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.087110 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: E0318 09:03:43.188679 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:43 crc kubenswrapper[4778]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 09:03:43 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:43 crc kubenswrapper[4778]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 09:03:43 crc kubenswrapper[4778]: source /etc/kubernetes/apiserver-url.env Mar 18 09:03:43 crc kubenswrapper[4778]: else Mar 18 09:03:43 crc kubenswrapper[4778]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 09:03:43 crc kubenswrapper[4778]: exit 1 Mar 18 09:03:43 crc kubenswrapper[4778]: fi Mar 18 09:03:43 crc kubenswrapper[4778]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 09:03:43 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:43 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:43 crc kubenswrapper[4778]: E0318 09:03:43.189362 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:43 crc kubenswrapper[4778]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:43 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:43 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:43 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:43 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:43 crc 
kubenswrapper[4778]: fi Mar 18 09:03:43 crc kubenswrapper[4778]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 09:03:43 crc kubenswrapper[4778]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 09:03:43 crc kubenswrapper[4778]: ho_enable="--enable-hybrid-overlay" Mar 18 09:03:43 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 09:03:43 crc kubenswrapper[4778]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 09:03:43 crc kubenswrapper[4778]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 09:03:43 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:43 crc kubenswrapper[4778]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 09:03:43 crc kubenswrapper[4778]: --webhook-host=127.0.0.1 \ Mar 18 09:03:43 crc kubenswrapper[4778]: --webhook-port=9743 \ Mar 18 09:03:43 crc kubenswrapper[4778]: ${ho_enable} \ Mar 18 09:03:43 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:03:43 crc kubenswrapper[4778]: --disable-approver \ Mar 18 09:03:43 crc kubenswrapper[4778]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 09:03:43 crc kubenswrapper[4778]: --wait-for-kubernetes-api=200s \ Mar 18 09:03:43 crc kubenswrapper[4778]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 09:03:43 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:43 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:43 crc 
kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.189516 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.189536 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.189547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.189564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.189576 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: E0318 09:03:43.190360 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 09:03:43 crc kubenswrapper[4778]: E0318 09:03:43.191112 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:43 crc kubenswrapper[4778]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:43 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:43 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:43 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:43 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:43 crc kubenswrapper[4778]: fi Mar 18 09:03:43 crc kubenswrapper[4778]: Mar 18 09:03:43 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 09:03:43 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:43 crc kubenswrapper[4778]: --disable-webhook \ Mar 18 09:03:43 crc kubenswrapper[4778]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 09:03:43 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:43 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:43 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:43 crc kubenswrapper[4778]: E0318 09:03:43.192341 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.291676 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.291720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.291731 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.291750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.291764 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.394788 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.394833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.394843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.394858 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.394870 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.497554 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.497645 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.497666 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.497692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.497711 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.599765 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.599844 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.599862 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.599903 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.599922 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.702692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.702758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.702777 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.702802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.702821 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.806165 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.806260 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.806280 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.806313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.806332 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.909424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.909513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.909536 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.909566 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.909587 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.012938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.013404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.013429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.013459 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.013483 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.115587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.115645 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.115669 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.115697 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.115720 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.186609 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.186656 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:44 crc kubenswrapper[4778]: E0318 09:03:44.186777 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.186799 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:44 crc kubenswrapper[4778]: E0318 09:03:44.186986 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:44 crc kubenswrapper[4778]: E0318 09:03:44.187114 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:44 crc kubenswrapper[4778]: E0318 09:03:44.189303 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]V
olumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:03:44 crc kubenswrapper[4778]: E0318 09:03:44.190452 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.206821 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d8
3d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.218302 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.218642 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.218739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.218844 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.218927 4778 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.219562 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.231844 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.246944 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.262230 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.286473 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.298124 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.309992 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.322108 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.322143 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.322157 4778 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.322221 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.322235 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.425232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.425279 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.425288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.425305 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.425323 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.528024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.528066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.528074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.528088 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.528097 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.630391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.630432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.630442 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.630457 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.630467 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.733564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.733629 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.733646 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.733670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.733687 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.836022 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.836073 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.836091 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.836114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.836131 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.938757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.939099 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.939165 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.939285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.939376 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.042339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.042402 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.042420 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.042444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.042461 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.145383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.145446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.145463 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.145485 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.145509 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.248715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.248764 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.248774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.248792 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.248804 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.351526 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.351594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.351613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.351638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.351655 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.455359 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.455430 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.455455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.455478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.455495 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.557711 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.557756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.557767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.557785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.557796 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.659878 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.659933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.659950 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.659973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.659991 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.762293 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.762339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.762347 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.762361 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.762370 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.864913 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.864946 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.864954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.864967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.864976 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.896238 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.896477 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 09:04:01.896439463 +0000 UTC m=+108.471184303 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.968324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.968375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.968388 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.968406 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.968418 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.996957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.997011 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.997044 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.997069 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997213 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997223 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997236 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997241 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997314 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:01.997291474 +0000 UTC m=+108.572036394 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997251 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997249 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997420 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:01.997391237 +0000 UTC m=+108.572136087 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997314 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997450 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997512 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:01.99749567 +0000 UTC m=+108.572240520 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997544 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:01.9975219 +0000 UTC m=+108.572266780 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.071701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.071813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.071843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.071883 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.071932 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.174575 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.174656 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.174668 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.174695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.174710 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.186869 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.186986 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.186924 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:46 crc kubenswrapper[4778]: E0318 09:03:46.187052 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:46 crc kubenswrapper[4778]: E0318 09:03:46.187148 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:46 crc kubenswrapper[4778]: E0318 09:03:46.187467 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.277432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.277476 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.277494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.277516 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.277535 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.380675 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.380751 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.380774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.380803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.380838 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.483136 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.483242 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.483267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.483296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.483318 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.586348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.586459 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.586478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.586510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.586532 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.689921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.689988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.690007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.690036 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.690051 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.792705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.792741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.792752 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.792769 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.792781 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.895586 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.895625 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.895637 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.895653 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.895667 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.998567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.998621 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.998639 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.998653 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.998662 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.101480 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.101587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.101606 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.101629 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.101643 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.205337 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.205404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.205423 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.205453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.205473 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.307633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.307672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.307727 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.307746 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.307760 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.351577 4778 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.410274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.410607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.410699 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.410846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.410942 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.513634 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.513941 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.514102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.514296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.514443 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.617440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.617506 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.617522 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.617545 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.617563 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.720918 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.721478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.721681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.721865 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.722030 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.825558 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.825603 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.825616 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.825641 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.825653 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.928622 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.928699 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.928718 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.928747 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.928766 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.031676 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.032045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.032274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.032460 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.032609 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.135821 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.135874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.135884 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.135904 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.135915 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.186497 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.186570 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.186700 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:48 crc kubenswrapper[4778]: E0318 09:03:48.186688 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:48 crc kubenswrapper[4778]: E0318 09:03:48.187007 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:48 crc kubenswrapper[4778]: E0318 09:03:48.187077 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.240089 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.240168 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.240187 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.240248 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.240268 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.344291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.344377 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.344397 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.344433 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.344454 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.448092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.448236 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.448257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.448289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.448308 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.551813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.551863 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.551879 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.551904 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.551920 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.654531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.654590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.654607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.654631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.654648 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.757189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.757275 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.757306 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.757330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.757349 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.861269 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.861348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.861365 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.861391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.861410 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.963708 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.963750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.963761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.963781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.963792 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.067328 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.067424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.067449 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.067479 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.067497 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.170064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.170147 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.170162 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.170181 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.170209 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.273146 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.273249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.273270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.273297 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.273318 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.375972 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.376059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.376077 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.376112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.376132 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.478904 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.478982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.479004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.479028 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.479046 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.582814 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.582878 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.582901 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.582925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.582943 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.685744 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.685795 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.685807 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.685824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.685837 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.788870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.788941 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.788957 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.788987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.789012 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.891327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.891403 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.891420 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.891445 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.891463 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.994145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.994185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.994217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.994237 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.994251 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.097401 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.097442 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.097454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.097472 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.097485 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.186779 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.186895 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:50 crc kubenswrapper[4778]: E0318 09:03:50.186984 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.186998 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:50 crc kubenswrapper[4778]: E0318 09:03:50.187136 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:50 crc kubenswrapper[4778]: E0318 09:03:50.187269 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.200853 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.200923 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.200943 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.200966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.200984 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.303043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.303086 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.303105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.303128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.303145 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.405651 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.405709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.405723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.405745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.405760 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.510129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.510391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.510723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.511026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.511396 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.614187 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.614571 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.614715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.614847 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.614971 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.717855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.718103 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.718353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.718566 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.718767 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.821760 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.821847 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.821870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.821902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.821924 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.928569 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.928629 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.928645 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.928669 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.928711 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.032220 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.032263 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.032276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.032295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.032308 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.135689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.135747 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.135773 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.135800 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.135820 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.239255 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.239320 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.239342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.239371 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.239392 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.342536 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.342953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.343159 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.343431 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.343604 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.447012 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.447074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.447091 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.447119 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.447136 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.550743 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.551046 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.551107 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.551167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.551266 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.654581 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.654638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.654655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.654676 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.654694 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.758189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.758266 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.758279 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.758303 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.758317 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.861681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.862115 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.862316 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.862500 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.862680 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.966329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.966402 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.966421 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.966446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.966463 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.069491 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.069582 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.069595 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.069612 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.069628 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.173019 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.173064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.173079 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.173102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.173115 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.186619 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.186700 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.186862 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.187016 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.187171 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.187289 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.187474 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.187796 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.276461 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.276518 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.276541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.276569 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.276592 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.379755 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.379802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.379812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.379828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.379837 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.483005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.483092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.483111 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.483668 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.483723 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.587634 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.587711 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.587734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.587763 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.587785 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.625279 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.625320 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.625332 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.625349 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.625362 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.640707 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.645554 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.645594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.645605 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.645622 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.645634 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.659774 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.663954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.663983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.663991 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.664005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.664013 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.674875 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.679479 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.679511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.679520 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.679534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.679545 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.693077 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.697499 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.697564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.697587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.697617 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.697639 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.711740 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.711980 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.714454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.714503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.714521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.714543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.714560 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.817588 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.817658 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.817682 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.817710 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.817732 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.920662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.920690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.920700 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.920714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.920725 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.023856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.023933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.023956 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.023988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.024013 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.127727 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.127879 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.127905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.127935 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.127958 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.231327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.231396 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.231419 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.231447 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.231469 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.334601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.334736 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.334767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.334809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.334849 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.438366 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.438436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.438462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.438493 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.438516 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.542239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.542318 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.542339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.542365 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.542386 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.644652 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.644689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.644697 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.644710 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.644738 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.747450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.747714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.747927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.748099 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.748337 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.853030 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.853114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.853137 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.853169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.853233 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.956618 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.956690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.956714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.956745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.956768 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.060410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.060495 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.060521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.060552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.060570 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.164060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.164141 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.164167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.164220 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.164306 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.187141 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.187237 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:54 crc kubenswrapper[4778]: E0318 09:03:54.187339 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:54 crc kubenswrapper[4778]: E0318 09:03:54.187440 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.187034 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:54 crc kubenswrapper[4778]: E0318 09:03:54.187918 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.205227 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.218268 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.230962 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.245384 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.261492 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.267389 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.267444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.267463 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.267484 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.267499 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.282530 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.302983 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.313904 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.370940 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.371004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.371033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.371063 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.371086 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.474373 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.474440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.474462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.474527 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.474550 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.577267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.577333 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.577348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.577368 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.577383 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.679843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.679899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.679910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.679930 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.679943 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.782936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.783372 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.783502 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.783657 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.783767 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.886142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.886185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.886215 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.886235 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.886247 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.989350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.989414 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.989432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.989457 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.989476 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.092633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.092695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.092714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.092743 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.092761 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: E0318 09:03:55.189607 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Re
startPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:03:55 crc kubenswrapper[4778]: E0318 09:03:55.189712 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:55 crc kubenswrapper[4778]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 09:03:55 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:55 crc kubenswrapper[4778]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 09:03:55 crc kubenswrapper[4778]: source /etc/kubernetes/apiserver-url.env Mar 18 09:03:55 crc kubenswrapper[4778]: else Mar 18 09:03:55 crc kubenswrapper[4778]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 09:03:55 crc kubenswrapper[4778]: exit 1 Mar 18 09:03:55 crc kubenswrapper[4778]: fi Mar 18 09:03:55 crc kubenswrapper[4778]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 09:03:55 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:55 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:55 crc kubenswrapper[4778]: E0318 09:03:55.190809 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 09:03:55 crc kubenswrapper[4778]: E0318 09:03:55.191009 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet 
been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.194713 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.194796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.194816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.194839 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.194857 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.297720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.297771 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.297783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.297803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.297818 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.400821 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.400894 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.400905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.400920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.400930 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.504119 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.504248 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.504276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.504313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.504337 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.607351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.607439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.607458 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.607486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.607507 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.711501 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.711861 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.712059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.712374 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.712590 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.816246 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.816312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.816327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.816350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.816365 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.920023 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.920087 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.920103 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.920130 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.920148 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.022937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.022992 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.023010 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.023033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.023050 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.126650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.126726 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.126746 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.126772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.126790 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.186987 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.187770 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.187830 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:56 crc kubenswrapper[4778]: E0318 09:03:56.187899 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:56 crc kubenswrapper[4778]: E0318 09:03:56.188383 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:56 crc kubenswrapper[4778]: E0318 09:03:56.188612 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:56 crc kubenswrapper[4778]: E0318 09:03:56.189764 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:56 crc kubenswrapper[4778]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:56 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:56 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:56 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:56 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:56 crc kubenswrapper[4778]: fi Mar 18 09:03:56 crc kubenswrapper[4778]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 09:03:56 crc kubenswrapper[4778]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 09:03:56 crc kubenswrapper[4778]: ho_enable="--enable-hybrid-overlay" Mar 18 09:03:56 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 09:03:56 crc kubenswrapper[4778]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 09:03:56 crc kubenswrapper[4778]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 09:03:56 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:56 crc kubenswrapper[4778]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 09:03:56 crc kubenswrapper[4778]: --webhook-host=127.0.0.1 \ Mar 18 09:03:56 crc kubenswrapper[4778]: --webhook-port=9743 \ Mar 18 09:03:56 crc kubenswrapper[4778]: ${ho_enable} \ Mar 18 09:03:56 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:03:56 crc 
kubenswrapper[4778]: --disable-approver \ Mar 18 09:03:56 crc kubenswrapper[4778]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 09:03:56 crc kubenswrapper[4778]: --wait-for-kubernetes-api=200s \ Mar 18 09:03:56 crc kubenswrapper[4778]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 09:03:56 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:56 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions
:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:56 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:56 crc kubenswrapper[4778]: E0318 09:03:56.192505 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:56 crc kubenswrapper[4778]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:56 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:56 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:56 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:56 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:56 crc kubenswrapper[4778]: fi Mar 18 09:03:56 crc kubenswrapper[4778]: Mar 18 09:03:56 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 09:03:56 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:56 crc kubenswrapper[4778]: --disable-webhook \ Mar 18 09:03:56 crc kubenswrapper[4778]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 09:03:56 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:56 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:56 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:56 crc kubenswrapper[4778]: E0318 09:03:56.194046 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.229640 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.229698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.229715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.229740 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.229757 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.333633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.333716 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.333733 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.333759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.333778 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.437264 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.437340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.437365 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.437404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.437426 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.540482 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.540563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.540587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.540620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.540645 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.643855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.643902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.643919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.643944 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.643966 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.746855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.746945 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.746964 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.746990 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.747009 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.850833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.850895 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.850912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.850935 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.850956 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.954289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.954371 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.954389 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.954413 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.954432 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.056680 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.056739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.056756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.056781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.056798 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.159567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.159655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.159904 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.159940 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.159964 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.263451 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.263517 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.263534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.263597 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.263615 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.367146 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.367258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.367277 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.367300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.367319 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.474628 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.474803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.475537 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.475590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.475612 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.578828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.578912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.578946 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.578978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.579000 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.682175 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.682242 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.682251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.682267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.682281 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.786612 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.786660 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.786669 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.786686 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.786698 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.890774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.890829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.890837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.890855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.890866 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.993607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.993678 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.993704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.993742 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.993771 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.096800 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.096859 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.096873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.096892 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.096905 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.186768 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.186765 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.186765 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:58 crc kubenswrapper[4778]: E0318 09:03:58.187655 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:58 crc kubenswrapper[4778]: E0318 09:03:58.187801 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:58 crc kubenswrapper[4778]: E0318 09:03:58.188090 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.200405 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.200458 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.200471 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.200493 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.200508 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.303915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.304004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.304023 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.304045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.304062 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.407014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.407121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.407148 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.407177 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.407232 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.510946 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.511014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.511034 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.511059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.511079 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.614400 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.614460 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.614478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.614503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.614522 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.718184 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.718288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.718306 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.718331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.718348 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.822054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.822164 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.822187 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.822257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.822322 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.927063 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.927168 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.927232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.927323 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.927386 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.030671 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.030731 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.030750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.030772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.030791 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.133983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.134048 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.134065 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.134088 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.134108 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.237124 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.237227 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.237255 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.237286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.237311 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.340823 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.340890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.340910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.340939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.340959 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.400473 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-dfnnp"] Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.401006 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.405753 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.406696 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.409049 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.425315 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9b2b\" (UniqueName: \"kubernetes.io/projected/8cf64307-e191-476a-902b-93001adc0b16-kube-api-access-f9b2b\") pod \"node-resolver-dfnnp\" (UID: \"8cf64307-e191-476a-902b-93001adc0b16\") " pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.425372 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8cf64307-e191-476a-902b-93001adc0b16-hosts-file\") pod \"node-resolver-dfnnp\" (UID: \"8cf64307-e191-476a-902b-93001adc0b16\") " pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.438359 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.443492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.443536 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.443549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.443579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.443595 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.463776 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.486880 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] 
check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc
073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.506312 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.517889 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.526625 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8cf64307-e191-476a-902b-93001adc0b16-hosts-file\") pod \"node-resolver-dfnnp\" (UID: \"8cf64307-e191-476a-902b-93001adc0b16\") " pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.526679 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9b2b\" (UniqueName: \"kubernetes.io/projected/8cf64307-e191-476a-902b-93001adc0b16-kube-api-access-f9b2b\") pod \"node-resolver-dfnnp\" (UID: \"8cf64307-e191-476a-902b-93001adc0b16\") " pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.526826 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8cf64307-e191-476a-902b-93001adc0b16-hosts-file\") pod \"node-resolver-dfnnp\" (UID: \"8cf64307-e191-476a-902b-93001adc0b16\") " pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: 
I0318 09:03:59.540350 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.546629 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.546690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.546708 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.546734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.546753 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.550785 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9b2b\" (UniqueName: \"kubernetes.io/projected/8cf64307-e191-476a-902b-93001adc0b16-kube-api-access-f9b2b\") pod \"node-resolver-dfnnp\" (UID: \"8cf64307-e191-476a-902b-93001adc0b16\") " pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.555986 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.570652 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.583599 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.656706 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.656775 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.656795 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.656821 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 
09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.656840 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.727263 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.760912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.760966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.760983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.761007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.761024 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: W0318 09:03:59.771952 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cf64307_e191_476a_902b_93001adc0b16.slice/crio-3fe873f767a719bf5f4f5ff009d2646ec2788a92e8356376e715112f3ab6128c WatchSource:0}: Error finding container 3fe873f767a719bf5f4f5ff009d2646ec2788a92e8356376e715112f3ab6128c: Status 404 returned error can't find the container with id 3fe873f767a719bf5f4f5ff009d2646ec2788a92e8356376e715112f3ab6128c Mar 18 09:03:59 crc kubenswrapper[4778]: E0318 09:03:59.775727 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:59 crc kubenswrapper[4778]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 09:03:59 crc kubenswrapper[4778]: set -uo pipefail Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 09:03:59 crc kubenswrapper[4778]: HOSTS_FILE="/etc/hosts" Mar 18 09:03:59 crc kubenswrapper[4778]: TEMP_FILE="/etc/hosts.tmp" Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: # Make a temporary file with the old hosts file's attributes. Mar 18 09:03:59 crc kubenswrapper[4778]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 09:03:59 crc kubenswrapper[4778]: echo "Failed to preserve hosts file. Exiting." 
Mar 18 09:03:59 crc kubenswrapper[4778]: exit 1 Mar 18 09:03:59 crc kubenswrapper[4778]: fi Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: while true; do Mar 18 09:03:59 crc kubenswrapper[4778]: declare -A svc_ips Mar 18 09:03:59 crc kubenswrapper[4778]: for svc in "${services[@]}"; do Mar 18 09:03:59 crc kubenswrapper[4778]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 09:03:59 crc kubenswrapper[4778]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 18 09:03:59 crc kubenswrapper[4778]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 09:03:59 crc kubenswrapper[4778]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 18 09:03:59 crc kubenswrapper[4778]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:03:59 crc kubenswrapper[4778]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:03:59 crc kubenswrapper[4778]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:03:59 crc kubenswrapper[4778]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 09:03:59 crc kubenswrapper[4778]: for i in ${!cmds[*]} Mar 18 09:03:59 crc kubenswrapper[4778]: do Mar 18 09:03:59 crc kubenswrapper[4778]: ips=($(eval "${cmds[i]}")) Mar 18 09:03:59 crc kubenswrapper[4778]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 09:03:59 crc kubenswrapper[4778]: svc_ips["${svc}"]="${ips[@]}" Mar 18 09:03:59 crc kubenswrapper[4778]: break Mar 18 09:03:59 crc kubenswrapper[4778]: fi Mar 18 09:03:59 crc kubenswrapper[4778]: done Mar 18 09:03:59 crc kubenswrapper[4778]: done Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: # Update /etc/hosts only if we get valid service IPs Mar 18 09:03:59 crc kubenswrapper[4778]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 09:03:59 crc kubenswrapper[4778]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 09:03:59 crc kubenswrapper[4778]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 09:03:59 crc kubenswrapper[4778]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 09:03:59 crc kubenswrapper[4778]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 09:03:59 crc kubenswrapper[4778]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 09:03:59 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:03:59 crc kubenswrapper[4778]: continue Mar 18 09:03:59 crc kubenswrapper[4778]: fi Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: # Append resolver entries for services Mar 18 09:03:59 crc kubenswrapper[4778]: rc=0 Mar 18 09:03:59 crc kubenswrapper[4778]: for svc in "${!svc_ips[@]}"; do Mar 18 09:03:59 crc kubenswrapper[4778]: for ip in ${svc_ips[${svc}]}; do Mar 18 09:03:59 crc kubenswrapper[4778]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 18 09:03:59 crc kubenswrapper[4778]: done Mar 18 09:03:59 crc kubenswrapper[4778]: done Mar 18 09:03:59 crc kubenswrapper[4778]: if [[ $rc -ne 0 ]]; then Mar 18 09:03:59 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:03:59 crc kubenswrapper[4778]: continue Mar 18 09:03:59 crc kubenswrapper[4778]: fi Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 09:03:59 crc kubenswrapper[4778]: # Replace /etc/hosts with our modified version if needed Mar 18 09:03:59 crc kubenswrapper[4778]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 09:03:59 crc kubenswrapper[4778]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 09:03:59 crc kubenswrapper[4778]: fi Mar 18 09:03:59 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:03:59 crc kubenswrapper[4778]: unset svc_ips Mar 18 09:03:59 crc kubenswrapper[4778]: done Mar 18 09:03:59 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f9b2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-dfnnp_openshift-dns(8cf64307-e191-476a-902b-93001adc0b16): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:59 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:59 crc kubenswrapper[4778]: E0318 09:03:59.776992 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-dfnnp" podUID="8cf64307-e191-476a-902b-93001adc0b16" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.795856 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-r2lvf"] Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.796602 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.797939 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xkfx8"] Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.799077 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-56rc7"] Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.799397 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.799673 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.799785 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.800997 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.801070 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.801191 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.802008 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.802161 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.807434 4778 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.807476 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.807499 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.807596 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.807600 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.808024 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.823170 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, 
/tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829715 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjjcr\" (UniqueName: \"kubernetes.io/projected/7243f983-24d5-48ef-858b-5f4049a82acc-kube-api-access-gjjcr\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829779 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dce973f3-25e6-4536-87cc-9b46499ad7cf-cni-binary-copy\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829816 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-k8s-cni-cncf-io\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829867 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-os-release\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: 
\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829899 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7243f983-24d5-48ef-858b-5f4049a82acc-proxy-tls\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829933 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-cnibin\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829961 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-hostroot\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829990 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-multus-certs\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7243f983-24d5-48ef-858b-5f4049a82acc-rootfs\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") 
" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2zsl\" (UniqueName: \"kubernetes.io/projected/dce973f3-25e6-4536-87cc-9b46499ad7cf-kube-api-access-x2zsl\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830119 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-conf-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830167 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-cni-multus\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830252 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830309 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-os-release\") pod \"multus-r2lvf\" (UID: 
\"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830404 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cxhf\" (UniqueName: \"kubernetes.io/projected/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-kube-api-access-4cxhf\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830466 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-netns\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830516 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-cni-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830580 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cnibin\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830637 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830681 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-socket-dir-parent\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830723 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-etc-kubernetes\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830757 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-system-cni-dir\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830787 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830818 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/7243f983-24d5-48ef-858b-5f4049a82acc-mcd-auth-proxy-config\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830863 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-daemon-config\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830914 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-system-cni-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830951 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-cni-bin\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830983 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-kubelet\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.841633 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.857415 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.863555 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.863608 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.863623 4778 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.863645 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.863661 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.874341 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.887902 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.905288 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.924882 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\
",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.931779 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-system-cni-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.931864 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-cni-bin\") pod 
\"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.931904 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-kubelet\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.931982 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjjcr\" (UniqueName: \"kubernetes.io/projected/7243f983-24d5-48ef-858b-5f4049a82acc-kube-api-access-gjjcr\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932023 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dce973f3-25e6-4536-87cc-9b46499ad7cf-cni-binary-copy\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932023 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-cni-bin\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932067 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-k8s-cni-cncf-io\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " 
pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932110 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-kubelet\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932153 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-os-release\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932155 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-k8s-cni-cncf-io\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932022 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-system-cni-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932186 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7243f983-24d5-48ef-858b-5f4049a82acc-proxy-tls\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932362 
4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-cnibin\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932390 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-hostroot\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932433 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-multus-certs\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932454 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-cnibin\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932532 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-os-release\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932565 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-multus-certs\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932532 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-hostroot\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932581 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7243f983-24d5-48ef-858b-5f4049a82acc-rootfs\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932684 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7243f983-24d5-48ef-858b-5f4049a82acc-rootfs\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932685 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2zsl\" (UniqueName: \"kubernetes.io/projected/dce973f3-25e6-4536-87cc-9b46499ad7cf-kube-api-access-x2zsl\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932730 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-conf-dir\") pod \"multus-r2lvf\" (UID: 
\"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932774 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-cni-multus\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932802 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-conf-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932814 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932837 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-cni-multus\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932854 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-os-release\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc 
kubenswrapper[4778]: I0318 09:03:59.932896 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cxhf\" (UniqueName: \"kubernetes.io/projected/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-kube-api-access-4cxhf\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932930 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-netns\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932964 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-cni-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932993 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-os-release\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933002 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cnibin\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933040 4778 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cnibin\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933068 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933081 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-netns\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933093 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-socket-dir-parent\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-cni-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933129 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-socket-dir-parent\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933168 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-etc-kubernetes\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933241 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-etc-kubernetes\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933248 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-system-cni-dir\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933292 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933334 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/7243f983-24d5-48ef-858b-5f4049a82acc-mcd-auth-proxy-config\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933355 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-system-cni-dir\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933378 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-daemon-config\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.934261 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dce973f3-25e6-4536-87cc-9b46499ad7cf-cni-binary-copy\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.934414 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.934531 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-daemon-config\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.934577 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7243f983-24d5-48ef-858b-5f4049a82acc-mcd-auth-proxy-config\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.935146 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.935170 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.937787 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.938370 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7243f983-24d5-48ef-858b-5f4049a82acc-proxy-tls\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.950146 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjjcr\" (UniqueName: \"kubernetes.io/projected/7243f983-24d5-48ef-858b-5f4049a82acc-kube-api-access-gjjcr\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.952123 4778 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.958458 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2zsl\" (UniqueName: \"kubernetes.io/projected/dce973f3-25e6-4536-87cc-9b46499ad7cf-kube-api-access-x2zsl\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.962408 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cxhf\" (UniqueName: \"kubernetes.io/projected/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-kube-api-access-4cxhf\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.966557 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.966597 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc 
kubenswrapper[4778]: I0318 09:03:59.966613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.966638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.966654 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.967723 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.983049 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.998377 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.010779 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.026102 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.046766 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.062741 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.070090 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.070152 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.070167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.070188 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.070224 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.076916 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.095707 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.114463 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.127464 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d8
3d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.132044 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-r2lvf" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.144424 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.146668 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:04:00 crc kubenswrapper[4778]: W0318 09:04:00.158361 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddce973f3_25e6_4536_87cc_9b46499ad7cf.slice/crio-2b56e0b581b82099bec21992e25b801c274f15b004cee836d238158d8adaa942 WatchSource:0}: Error finding container 2b56e0b581b82099bec21992e25b801c274f15b004cee836d238158d8adaa942: Status 404 returned error can't find the container with id 2b56e0b581b82099bec21992e25b801c274f15b004cee836d238158d8adaa942 Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.159515 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.160505 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.170488 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:00 crc kubenswrapper[4778]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 09:04:00 crc kubenswrapper[4778]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 09:04:00 crc kubenswrapper[4778]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2zsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-r2lvf_openshift-multus(dce973f3-25e6-4536-87cc-9b46499ad7cf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:00 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.172106 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-r2lvf" podUID="dce973f3-25e6-4536-87cc-9b46499ad7cf" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.173058 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.173160 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.173176 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.173208 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.173219 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.176288 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjjcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.176470 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g2qth"] Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.178430 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.179747 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjjcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.180547 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.180974 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.181451 4778 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.181719 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.181740 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.182150 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.182393 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.182482 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 09:04:00 crc kubenswrapper[4778]: W0318 09:04:00.184663 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1698c21_24a7_4338_a0ad_dd110c1ba2f2.slice/crio-9ab054141923d09f0e8931cd7a26bdf8d6372005e8578ff1891fddf501c4d056 WatchSource:0}: Error finding container 9ab054141923d09f0e8931cd7a26bdf8d6372005e8578ff1891fddf501c4d056: Status 404 returned error can't find the container with id 9ab054141923d09f0e8931cd7a26bdf8d6372005e8578ff1891fddf501c4d056 Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.186222 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.186238 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.186339 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.186410 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.186510 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.186612 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.188070 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cxhf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-xkfx8_openshift-multus(b1698c21-24a7-4338-a0ad-dd110c1ba2f2): 
CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.189669 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" podUID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.192561 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d8
3d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.203971 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.218325 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.228822 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.236675 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-etc-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.236743 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-ovn\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.236774 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-bin\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.236800 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8g6f\" (UniqueName: \"kubernetes.io/projected/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-kube-api-access-b8g6f\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.236926 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-kubelet\") pod 
\"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237020 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-systemd-units\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-systemd\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237068 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-var-lib-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-slash\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237122 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237139 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-script-lib\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237237 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-netns\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237264 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-env-overrides\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237283 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237364 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-node-log\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237443 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-log-socket\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237463 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-netd\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237728 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovn-node-metrics-cert\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237752 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-ovn-kubernetes\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237784 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-config\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.239397 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.250656 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.263193 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.276074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.276116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.276125 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.276142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.276154 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.279799 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.298219 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.316447 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.330231 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.338879 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-kubelet\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.338985 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-systemd-units\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339024 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-systemd\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339056 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-var-lib-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339098 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339134 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-slash\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339248 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-script-lib\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339284 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-netns\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339319 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-env-overrides\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339337 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339430 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339354 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: 
\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339506 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-node-log\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339518 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339547 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-log-socket\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339574 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-kubelet\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-netd\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339607 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-systemd-units\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339621 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovn-node-metrics-cert\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339641 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-systemd\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339658 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-ovn-kubernetes\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339671 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-var-lib-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc 
kubenswrapper[4778]: I0318 09:04:00.339703 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-config\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-etc-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339777 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-ovn\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339812 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8g6f\" (UniqueName: \"kubernetes.io/projected/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-kube-api-access-b8g6f\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339852 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-bin\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339938 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-bin\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339990 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-node-log\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.340037 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-log-socket\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.340083 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-netd\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.340086 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-slash\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.340838 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-netns\") 
pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.340886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-etc-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.340995 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-env-overrides\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.341060 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-ovn\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.341505 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-config\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.341604 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-ovn-kubernetes\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.341800 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-script-lib\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.346053 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovn-node-metrics-cert\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.360676 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.363002 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8g6f\" (UniqueName: \"kubernetes.io/projected/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-kube-api-access-b8g6f\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.378893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.378949 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.378958 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.378982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.378994 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.482926 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.483013 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.483037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.483064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.483083 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.501710 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: W0318 09:04:00.520167 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef97d63e_1caf_44c9_ac0c_9b03dbd05113.slice/crio-0662ffa0aa3c3fc99a955a63b995acc6a492d7cf6b911968c3e8039e26cdcb7f WatchSource:0}: Error finding container 0662ffa0aa3c3fc99a955a63b995acc6a492d7cf6b911968c3e8039e26cdcb7f: Status 404 returned error can't find the container with id 0662ffa0aa3c3fc99a955a63b995acc6a492d7cf6b911968c3e8039e26cdcb7f Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.524141 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:00 crc kubenswrapper[4778]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 09:04:00 crc kubenswrapper[4778]: apiVersion: v1 Mar 18 09:04:00 crc kubenswrapper[4778]: clusters: Mar 18 09:04:00 crc kubenswrapper[4778]: - cluster: Mar 18 09:04:00 crc kubenswrapper[4778]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 09:04:00 crc kubenswrapper[4778]: server: https://api-int.crc.testing:6443 Mar 18 09:04:00 crc kubenswrapper[4778]: name: default-cluster Mar 18 09:04:00 crc kubenswrapper[4778]: contexts: Mar 18 09:04:00 crc kubenswrapper[4778]: - context: Mar 18 09:04:00 crc kubenswrapper[4778]: cluster: default-cluster Mar 18 09:04:00 crc kubenswrapper[4778]: namespace: default Mar 18 09:04:00 crc kubenswrapper[4778]: user: default-auth Mar 18 09:04:00 crc kubenswrapper[4778]: name: default-context Mar 18 09:04:00 crc kubenswrapper[4778]: current-context: default-context Mar 18 09:04:00 crc kubenswrapper[4778]: kind: Config Mar 18 09:04:00 crc kubenswrapper[4778]: preferences: {} Mar 18 09:04:00 crc kubenswrapper[4778]: 
users: Mar 18 09:04:00 crc kubenswrapper[4778]: - name: default-auth Mar 18 09:04:00 crc kubenswrapper[4778]: user: Mar 18 09:04:00 crc kubenswrapper[4778]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 09:04:00 crc kubenswrapper[4778]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 09:04:00 crc kubenswrapper[4778]: EOF Mar 18 09:04:00 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b8g6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:00 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.526261 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 
09:04:00.589078 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.589153 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.589175 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.589234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.589269 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.669875 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"0662ffa0aa3c3fc99a955a63b995acc6a492d7cf6b911968c3e8039e26cdcb7f"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.672721 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerStarted","Data":"9ab054141923d09f0e8931cd7a26bdf8d6372005e8578ff1891fddf501c4d056"} Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.672940 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:00 crc kubenswrapper[4778]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 09:04:00 crc kubenswrapper[4778]: apiVersion: v1 Mar 18 09:04:00 crc kubenswrapper[4778]: clusters: Mar 18 09:04:00 crc kubenswrapper[4778]: - cluster: Mar 18 09:04:00 crc kubenswrapper[4778]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 09:04:00 crc kubenswrapper[4778]: server: https://api-int.crc.testing:6443 Mar 18 09:04:00 crc kubenswrapper[4778]: name: default-cluster Mar 18 09:04:00 crc kubenswrapper[4778]: contexts: Mar 18 09:04:00 crc kubenswrapper[4778]: - context: Mar 18 09:04:00 crc kubenswrapper[4778]: cluster: default-cluster Mar 18 09:04:00 crc kubenswrapper[4778]: namespace: default Mar 18 09:04:00 crc kubenswrapper[4778]: user: default-auth Mar 18 09:04:00 crc kubenswrapper[4778]: name: default-context Mar 18 09:04:00 crc kubenswrapper[4778]: current-context: default-context Mar 18 09:04:00 crc kubenswrapper[4778]: kind: Config Mar 18 09:04:00 crc 
kubenswrapper[4778]: preferences: {} Mar 18 09:04:00 crc kubenswrapper[4778]: users: Mar 18 09:04:00 crc kubenswrapper[4778]: - name: default-auth Mar 18 09:04:00 crc kubenswrapper[4778]: user: Mar 18 09:04:00 crc kubenswrapper[4778]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 09:04:00 crc kubenswrapper[4778]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 09:04:00 crc kubenswrapper[4778]: EOF Mar 18 09:04:00 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b8g6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:00 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.675531 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"8c20d90ff1890fad7b9e40ab7f878094324a004a1781e12dcf95fc402ccd00c9"} Mar 18 09:04:00 crc 
kubenswrapper[4778]: E0318 09:04:00.676229 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.677143 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cxhf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fallback
ToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-xkfx8_openshift-multus(b1698c21-24a7-4338-a0ad-dd110c1ba2f2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.678401 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" podUID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.679631 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerStarted","Data":"2b56e0b581b82099bec21992e25b801c274f15b004cee836d238158d8adaa942"} Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.681715 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:00 crc kubenswrapper[4778]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 09:04:00 crc kubenswrapper[4778]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 09:04:00 crc kubenswrapper[4778]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2zsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-r2lvf_openshift-multus(dce973f3-25e6-4536-87cc-9b46499ad7cf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:00 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.681937 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjjcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.683518 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-r2lvf" podUID="dce973f3-25e6-4536-87cc-9b46499ad7cf" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.684129 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dfnnp" event={"ID":"8cf64307-e191-476a-902b-93001adc0b16","Type":"ContainerStarted","Data":"3fe873f767a719bf5f4f5ff009d2646ec2788a92e8356376e715112f3ab6128c"} Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.684876 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjjcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.686687 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.687661 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:00 crc kubenswrapper[4778]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 09:04:00 crc kubenswrapper[4778]: set -uo pipefail Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 09:04:00 crc kubenswrapper[4778]: HOSTS_FILE="/etc/hosts" Mar 18 09:04:00 crc kubenswrapper[4778]: TEMP_FILE="/etc/hosts.tmp" Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: # Make a temporary file with the old hosts file's attributes. Mar 18 09:04:00 crc kubenswrapper[4778]: if ! 
cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 09:04:00 crc kubenswrapper[4778]: echo "Failed to preserve hosts file. Exiting." Mar 18 09:04:00 crc kubenswrapper[4778]: exit 1 Mar 18 09:04:00 crc kubenswrapper[4778]: fi Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: while true; do Mar 18 09:04:00 crc kubenswrapper[4778]: declare -A svc_ips Mar 18 09:04:00 crc kubenswrapper[4778]: for svc in "${services[@]}"; do Mar 18 09:04:00 crc kubenswrapper[4778]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 09:04:00 crc kubenswrapper[4778]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 18 09:04:00 crc kubenswrapper[4778]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 09:04:00 crc kubenswrapper[4778]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 18 09:04:00 crc kubenswrapper[4778]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:04:00 crc kubenswrapper[4778]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:04:00 crc kubenswrapper[4778]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:04:00 crc kubenswrapper[4778]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 09:04:00 crc kubenswrapper[4778]: for i in ${!cmds[*]} Mar 18 09:04:00 crc kubenswrapper[4778]: do Mar 18 09:04:00 crc kubenswrapper[4778]: ips=($(eval "${cmds[i]}")) Mar 18 09:04:00 crc kubenswrapper[4778]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 09:04:00 crc kubenswrapper[4778]: svc_ips["${svc}"]="${ips[@]}" Mar 18 09:04:00 crc kubenswrapper[4778]: break Mar 18 09:04:00 crc kubenswrapper[4778]: fi Mar 18 09:04:00 crc kubenswrapper[4778]: done Mar 18 09:04:00 crc kubenswrapper[4778]: done Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: # Update /etc/hosts only if we get valid service IPs Mar 18 09:04:00 crc kubenswrapper[4778]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 09:04:00 crc kubenswrapper[4778]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 09:04:00 crc kubenswrapper[4778]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 09:04:00 crc kubenswrapper[4778]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 09:04:00 crc kubenswrapper[4778]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 09:04:00 crc kubenswrapper[4778]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 09:04:00 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:04:00 crc kubenswrapper[4778]: continue Mar 18 09:04:00 crc kubenswrapper[4778]: fi Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: # Append resolver entries for services Mar 18 09:04:00 crc kubenswrapper[4778]: rc=0 Mar 18 09:04:00 crc kubenswrapper[4778]: for svc in "${!svc_ips[@]}"; do Mar 18 09:04:00 crc kubenswrapper[4778]: for ip in ${svc_ips[${svc}]}; do Mar 18 09:04:00 crc kubenswrapper[4778]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 18 09:04:00 crc kubenswrapper[4778]: done Mar 18 09:04:00 crc kubenswrapper[4778]: done Mar 18 09:04:00 crc kubenswrapper[4778]: if [[ $rc -ne 0 ]]; then Mar 18 09:04:00 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:04:00 crc kubenswrapper[4778]: continue Mar 18 09:04:00 crc kubenswrapper[4778]: fi Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 09:04:00 crc kubenswrapper[4778]: # Replace /etc/hosts with our modified version if needed Mar 18 09:04:00 crc kubenswrapper[4778]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 09:04:00 crc kubenswrapper[4778]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 09:04:00 crc kubenswrapper[4778]: fi Mar 18 09:04:00 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:04:00 crc kubenswrapper[4778]: unset svc_ips Mar 18 09:04:00 crc kubenswrapper[4778]: done Mar 18 09:04:00 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f9b2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-dfnnp_openshift-dns(8cf64307-e191-476a-902b-93001adc0b16): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:00 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.688862 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-dfnnp" podUID="8cf64307-e191-476a-902b-93001adc0b16" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.691836 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.692715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.692803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.692831 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.692864 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.692926 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.707855 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.722977 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.741861 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d8
3d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.758817 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.775119 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.788703 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.795863 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.795915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.795929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.795952 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.795966 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.815945 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.834490 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.859660 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.877755 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.892921 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.899556 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.899631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.899647 4778 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.899669 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.899687 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.913538 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.936306 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.954060 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.968933 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.995755 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.002916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.002980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.003001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.003031 4778 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.003051 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.008947 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.024757 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d8
3d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.038516 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.050313 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.060602 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.072832 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.087281 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.098952 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.107898 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.107949 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.107960 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.107979 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.107992 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.116168 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.215703 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.215823 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.215883 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.215912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.215932 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.319989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.320052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.320071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.320096 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.320118 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.423657 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.423738 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.423756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.423781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.423799 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.526547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.526599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.526610 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.526628 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.526640 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.630106 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.630494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.630672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.630874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.631042 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.734459 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.734514 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.734531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.734555 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.734573 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.837776 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.837834 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.837851 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.837873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.837891 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.940903 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.940966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.940982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.941007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.941024 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.957532 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:04:01 crc kubenswrapper[4778]: E0318 09:04:01.957782 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 09:04:33.957747667 +0000 UTC m=+140.532492537 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.044422 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.044807 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.044929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.045071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.045226 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.058460 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.058529 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.058577 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.058610 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.058770 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.058846 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:34.058823234 +0000 UTC m=+140.633568104 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059116 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059172 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:34.059158933 +0000 UTC m=+140.633903803 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059383 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059433 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059451 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059537 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:34.059510382 +0000 UTC m=+140.634255232 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059657 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059722 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059752 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059891 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:34.059850291 +0000 UTC m=+140.634595311 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.148346 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.148431 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.148452 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.148485 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.148510 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.187238 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.187330 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.187415 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.187581 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.187909 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.187988 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.252521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.252581 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.252598 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.252623 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.252644 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.355912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.355987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.356009 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.356039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.356062 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.459310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.459387 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.459408 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.459438 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.459457 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.563888 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.564289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.564479 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.564672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.564817 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.669475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.670960 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.671004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.671030 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.671047 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.774311 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.774383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.774402 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.774429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.774447 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.878435 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.878510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.878532 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.878570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.878595 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.981580 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.981661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.981682 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.981721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.982181 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.085062 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.085496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.085646 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.085787 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.085956 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.102813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.102910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.102929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.102954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.102978 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: E0318 09:04:03.119561 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.126655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.126742 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.126767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.126798 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.126823 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: E0318 09:04:03.143796 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.148568 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.148731 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.148834 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.148939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.149025 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: E0318 09:04:03.166157 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.171692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.171856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.171942 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.172053 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.172132 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: E0318 09:04:03.186054 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.187188 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.199748 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.199799 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.199834 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.199851 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.199865 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: E0318 09:04:03.219857 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: E0318 09:04:03.220364 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.223368 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.223462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.223479 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.223594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.223633 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.326927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.326992 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.327008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.327029 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.327042 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.429810 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.429868 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.429887 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.429916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.429935 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.533768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.533837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.533853 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.533880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.533901 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.605732 4778 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.637329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.637384 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.637403 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.637431 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.637455 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.696896 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.699813 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.700496 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.714503 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.732333 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.744496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.744535 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.744546 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.744561 4778 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.744572 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.765226 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.781549 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.797040 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.812183 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.825592 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.841711 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.847331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.847392 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.847410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.847436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.847454 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.857784 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.874981 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.887846 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.901222 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.911623 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.950866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.950902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.950914 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.950929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.950938 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.052999 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.053040 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.053050 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.053066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.053077 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.156290 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.156331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.156340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.156353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.156362 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.186258 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.186333 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:04 crc kubenswrapper[4778]: E0318 09:04:04.186387 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.186400 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:04 crc kubenswrapper[4778]: E0318 09:04:04.186487 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:04 crc kubenswrapper[4778]: E0318 09:04:04.186641 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.202134 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.218931 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.231916 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.251058 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.258324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.258377 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.258391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.258410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.258423 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.280866 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.304286 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.315760 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.324466 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.340430 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.360160 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.362058 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.362147 4778 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.362170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.362233 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.362257 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.376078 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.388959 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.399659 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.465695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.465754 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.465766 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.465782 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.465794 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.569485 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.569540 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.569549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.569567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.569578 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.672527 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.672593 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.672603 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.672619 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.672630 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.775262 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.775351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.775369 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.775406 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.775428 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.878936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.878983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.878993 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.879014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.879027 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.982166 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.982244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.982254 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.982272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.982284 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.085319 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.085378 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.085388 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.085411 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.085431 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.188620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.188690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.188700 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.188722 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.188735 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.291859 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.291916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.291927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.291949 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.291963 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.395993 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.396052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.396065 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.396084 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.396100 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.500419 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.500475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.500489 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.500519 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.500536 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.605128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.605258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.605288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.605321 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.605346 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.708577 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.708687 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.708709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.708778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.708798 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.812487 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.812578 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.812607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.812637 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.812658 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.917080 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.917159 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.917175 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.917219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.917235 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.021274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.021354 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.021383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.021414 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.021432 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.098354 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9f2bp"] Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.098834 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.103907 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.105492 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.106608 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.106834 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.126811 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.126873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.126890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.126919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.126937 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.129519 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.139057 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.157160 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.171567 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.186468 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.186542 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.186605 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.186709 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.186756 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.186925 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.193828 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.210097 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-serviceca\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.210180 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-host\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.210252 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grrhl\" (UniqueName: \"kubernetes.io/projected/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-kube-api-access-grrhl\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.218106 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.229228 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.229285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.229299 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.229317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.229330 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.232001 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.242646 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.259584 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.271440 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.283103 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.298342 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.310486 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.311305 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-serviceca\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.311389 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-host\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.311425 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-grrhl\" (UniqueName: \"kubernetes.io/projected/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-kube-api-access-grrhl\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.311517 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-host\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.312348 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-serviceca\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.322841 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.332653 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grrhl\" (UniqueName: \"kubernetes.io/projected/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-kube-api-access-grrhl\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.332680 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.332762 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.332781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.332805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.332820 4778 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.426590 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.437234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.437278 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.437294 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.437317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.437333 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: W0318 09:04:06.445457 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69b256e9_a9ba_4e2e_9a39_6d9ffa7fa6b7.slice/crio-d8552cce5c839edf2a4add9843ca9cf1c7697ee87571e7fc2ca14bc8665322e4 WatchSource:0}: Error finding container d8552cce5c839edf2a4add9843ca9cf1c7697ee87571e7fc2ca14bc8665322e4: Status 404 returned error can't find the container with id d8552cce5c839edf2a4add9843ca9cf1c7697ee87571e7fc2ca14bc8665322e4 Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.449031 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:06 crc kubenswrapper[4778]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 18 09:04:06 crc kubenswrapper[4778]: while [ true ]; Mar 18 09:04:06 crc kubenswrapper[4778]: do Mar 18 09:04:06 crc kubenswrapper[4778]: for f in $(ls /tmp/serviceca); do Mar 18 09:04:06 crc kubenswrapper[4778]: echo $f Mar 18 09:04:06 crc kubenswrapper[4778]: ca_file_path="/tmp/serviceca/${f}" Mar 18 09:04:06 crc kubenswrapper[4778]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 18 09:04:06 crc kubenswrapper[4778]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 18 09:04:06 crc kubenswrapper[4778]: if [ -e "${reg_dir_path}" ]; then Mar 18 09:04:06 crc kubenswrapper[4778]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 18 09:04:06 crc kubenswrapper[4778]: else Mar 18 09:04:06 crc kubenswrapper[4778]: mkdir $reg_dir_path Mar 18 09:04:06 crc kubenswrapper[4778]: cp $ca_file_path $reg_dir_path/ca.crt Mar 18 09:04:06 crc kubenswrapper[4778]: fi Mar 18 09:04:06 crc kubenswrapper[4778]: done Mar 18 09:04:06 crc kubenswrapper[4778]: for d in $(ls /etc/docker/certs.d); do Mar 18 09:04:06 crc 
kubenswrapper[4778]: echo $d Mar 18 09:04:06 crc kubenswrapper[4778]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 18 09:04:06 crc kubenswrapper[4778]: reg_conf_path="/tmp/serviceca/${dp}" Mar 18 09:04:06 crc kubenswrapper[4778]: if [ ! -e "${reg_conf_path}" ]; then Mar 18 09:04:06 crc kubenswrapper[4778]: rm -rf /etc/docker/certs.d/$d Mar 18 09:04:06 crc kubenswrapper[4778]: fi Mar 18 09:04:06 crc kubenswrapper[4778]: done Mar 18 09:04:06 crc kubenswrapper[4778]: sleep 60 & wait ${!} Mar 18 09:04:06 crc kubenswrapper[4778]: done Mar 18 09:04:06 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grrhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-9f2bp_openshift-image-registry(69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7): 
CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:06 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.451343 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-9f2bp" podUID="69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.540948 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.541017 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.541039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.541067 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.541090 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.644082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.644134 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.644149 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.644169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.644185 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.710684 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9f2bp" event={"ID":"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7","Type":"ContainerStarted","Data":"d8552cce5c839edf2a4add9843ca9cf1c7697ee87571e7fc2ca14bc8665322e4"} Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.712992 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:06 crc kubenswrapper[4778]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 18 09:04:06 crc kubenswrapper[4778]: while [ true ]; Mar 18 09:04:06 crc kubenswrapper[4778]: do Mar 18 09:04:06 crc kubenswrapper[4778]: for f in $(ls /tmp/serviceca); do Mar 18 09:04:06 crc kubenswrapper[4778]: echo $f Mar 18 09:04:06 crc kubenswrapper[4778]: ca_file_path="/tmp/serviceca/${f}" Mar 18 09:04:06 crc kubenswrapper[4778]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 18 09:04:06 crc kubenswrapper[4778]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 18 09:04:06 crc kubenswrapper[4778]: if [ -e "${reg_dir_path}" ]; then Mar 18 09:04:06 crc kubenswrapper[4778]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 18 09:04:06 crc kubenswrapper[4778]: else Mar 18 09:04:06 crc kubenswrapper[4778]: mkdir $reg_dir_path Mar 18 09:04:06 crc kubenswrapper[4778]: cp $ca_file_path $reg_dir_path/ca.crt Mar 18 09:04:06 crc kubenswrapper[4778]: fi Mar 18 09:04:06 crc kubenswrapper[4778]: done Mar 18 09:04:06 crc kubenswrapper[4778]: for d in $(ls /etc/docker/certs.d); do Mar 18 09:04:06 crc kubenswrapper[4778]: echo $d Mar 18 09:04:06 crc kubenswrapper[4778]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 18 09:04:06 crc kubenswrapper[4778]: reg_conf_path="/tmp/serviceca/${dp}" Mar 18 09:04:06 crc kubenswrapper[4778]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 18 09:04:06 crc kubenswrapper[4778]: rm -rf /etc/docker/certs.d/$d Mar 18 09:04:06 crc kubenswrapper[4778]: fi Mar 18 09:04:06 crc kubenswrapper[4778]: done Mar 18 09:04:06 crc kubenswrapper[4778]: sleep 60 & wait ${!} Mar 18 09:04:06 crc kubenswrapper[4778]: done Mar 18 09:04:06 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grrhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-9f2bp_openshift-image-registry(69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:06 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.714315 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-9f2bp" podUID="69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.731521 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.743792 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.747590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.747642 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.747656 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.747674 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.747686 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.773056 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.791828 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.806479 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.827088 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.843257 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.850956 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.851018 4778 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.851036 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.851059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.851076 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.865264 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.884639 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.895615 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.906484 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.918736 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.930313 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.951763 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.953845 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.953922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.953937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.953962 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.953979 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.056579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.056620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.056635 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.056652 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.056681 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.159652 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.159720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.159739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.159766 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.159784 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: E0318 09:04:07.188757 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:07 crc kubenswrapper[4778]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:04:07 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:04:07 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:07 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:04:07 crc kubenswrapper[4778]: set +o allexport Mar 18 09:04:07 crc kubenswrapper[4778]: fi Mar 18 09:04:07 crc kubenswrapper[4778]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 09:04:07 crc kubenswrapper[4778]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 09:04:07 crc kubenswrapper[4778]: ho_enable="--enable-hybrid-overlay" Mar 18 09:04:07 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 09:04:07 crc kubenswrapper[4778]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 09:04:07 crc kubenswrapper[4778]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 09:04:07 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:04:07 crc kubenswrapper[4778]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 09:04:07 crc kubenswrapper[4778]: --webhook-host=127.0.0.1 \ Mar 18 09:04:07 crc kubenswrapper[4778]: --webhook-port=9743 \ Mar 18 09:04:07 crc kubenswrapper[4778]: ${ho_enable} \ Mar 18 09:04:07 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:04:07 crc kubenswrapper[4778]: --disable-approver \ Mar 18 09:04:07 crc kubenswrapper[4778]: 
--extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 09:04:07 crc kubenswrapper[4778]: --wait-for-kubernetes-api=200s \ Mar 18 09:04:07 crc kubenswrapper[4778]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 09:04:07 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:04:07 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false
,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:07 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:07 crc kubenswrapper[4778]: E0318 09:04:07.188982 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:07 crc kubenswrapper[4778]: E0318 09:04:07.190115 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 09:04:07 crc kubenswrapper[4778]: E0318 09:04:07.191749 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:07 crc kubenswrapper[4778]: container 
&Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:04:07 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:04:07 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:07 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:04:07 crc kubenswrapper[4778]: set +o allexport Mar 18 09:04:07 crc kubenswrapper[4778]: fi Mar 18 09:04:07 crc kubenswrapper[4778]: Mar 18 09:04:07 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 09:04:07 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:04:07 crc kubenswrapper[4778]: --disable-webhook \ Mar 18 09:04:07 crc kubenswrapper[4778]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 09:04:07 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:04:07 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:07 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:07 crc kubenswrapper[4778]: E0318 09:04:07.193001 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.262158 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.262529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.262619 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.262717 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.262810 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.366686 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.366732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.366741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.366758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.366770 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.469280 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.469474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.469495 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.469518 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.469536 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.573088 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.573146 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.573165 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.573190 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.573251 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.678065 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.678600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.678809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.679044 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.679278 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.783685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.783745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.783761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.783787 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.783805 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.886267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.886572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.886663 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.886757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.886837 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.990251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.990372 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.990395 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.990424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.990442 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.094265 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.094340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.094364 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.094393 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.094413 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.187675 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:08 crc kubenswrapper[4778]: E0318 09:04:08.188508 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.188568 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.188839 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:08 crc kubenswrapper[4778]: E0318 09:04:08.191897 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:08 crc kubenswrapper[4778]: E0318 09:04:08.191999 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.196363 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.196409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.196423 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.196439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.196451 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.299708 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.300117 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.300188 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.300288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.300353 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.402881 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.402920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.402928 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.402944 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.402953 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.505802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.505836 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.505843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.505872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.505882 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.609725 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.609796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.609818 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.609846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.609866 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.713659 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.713734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.713758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.713788 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.713816 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.817133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.817276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.817296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.817325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.817345 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.920920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.921011 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.921045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.921086 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.921114 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.024660 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.024725 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.024743 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.024769 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.024786 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.127778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.128544 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.128622 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.128789 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.129014 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: E0318 09:04:09.190660 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:09 crc kubenswrapper[4778]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 09:04:09 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:09 crc kubenswrapper[4778]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 09:04:09 crc kubenswrapper[4778]: source /etc/kubernetes/apiserver-url.env Mar 18 09:04:09 crc kubenswrapper[4778]: else Mar 18 09:04:09 crc kubenswrapper[4778]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 09:04:09 crc kubenswrapper[4778]: exit 1 Mar 18 09:04:09 crc kubenswrapper[4778]: fi Mar 18 09:04:09 crc kubenswrapper[4778]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 09:04:09 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:09 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:09 crc kubenswrapper[4778]: E0318 09:04:09.191946 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.233037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 
09:04:09.233107 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.233124 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.233152 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.233175 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.335852 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.335907 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.335919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.335938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.335951 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.439547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.439631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.439648 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.439672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.439690 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.543069 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.543138 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.543167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.543232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.543259 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.647325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.647416 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.647440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.647468 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.647489 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.750818 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.750891 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.750908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.750931 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.750948 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.854168 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.854241 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.854254 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.854274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.854291 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.957856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.957921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.958066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.958127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.958149 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.061906 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.061948 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.061959 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.061978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.061990 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.165688 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.165753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.165771 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.165798 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.165817 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.186756 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.187027 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.187052 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:10 crc kubenswrapper[4778]: E0318 09:04:10.187140 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:10 crc kubenswrapper[4778]: E0318 09:04:10.187309 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:10 crc kubenswrapper[4778]: E0318 09:04:10.186964 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.269060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.269135 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.269154 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.269179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.269241 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.373046 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.373232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.373261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.373295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.373318 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.477351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.477403 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.477421 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.477446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.477464 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.581286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.581344 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.581361 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.581390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.581413 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.685249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.685311 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.685331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.685358 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.685381 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.788907 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.788972 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.788990 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.789013 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.789033 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.892336 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.892409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.892445 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.892489 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.892525 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.996191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.996286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.996310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.996340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.996359 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.099655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.099718 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.099735 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.099759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.099775 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.197060 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.202814 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.202963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.203030 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.203066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.203124 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.306440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.306488 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.306502 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.306528 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.306545 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.410276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.410383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.410452 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.410486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.410554 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.513618 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.513666 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.513685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.513707 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.513724 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.615804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.615867 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.615881 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.615911 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.615926 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.718681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.719032 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.719148 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.719314 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.719413 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.822679 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.822734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.822751 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.822774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.822795 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.926162 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.926232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.926244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.926260 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.926272 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.997816 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f"] Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.998841 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.001846 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.003735 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.018697 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.029761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.035827 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.035890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.035922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.035970 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.038733 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.071015 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.075813 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6q8m\" (UniqueName: \"kubernetes.io/projected/19777429-4133-4e70-b2dd-c61c54abdec4-kube-api-access-b6q8m\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.075890 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19777429-4133-4e70-b2dd-c61c54abdec4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.076041 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19777429-4133-4e70-b2dd-c61c54abdec4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.076271 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19777429-4133-4e70-b2dd-c61c54abdec4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: 
\"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.086383 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.113313 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.130135 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.140093 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.140343 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.140547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.140786 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.141023 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.143490 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.164337 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.176790 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19777429-4133-4e70-b2dd-c61c54abdec4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.177144 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6q8m\" (UniqueName: 
\"kubernetes.io/projected/19777429-4133-4e70-b2dd-c61c54abdec4-kube-api-access-b6q8m\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.177648 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19777429-4133-4e70-b2dd-c61c54abdec4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.178358 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19777429-4133-4e70-b2dd-c61c54abdec4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.176828 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.177845 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19777429-4133-4e70-b2dd-c61c54abdec4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.178733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19777429-4133-4e70-b2dd-c61c54abdec4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.184364 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/19777429-4133-4e70-b2dd-c61c54abdec4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.186489 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.186692 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.187022 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.187128 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.187394 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.187493 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.195977 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.198733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6q8m\" (UniqueName: \"kubernetes.io/projected/19777429-4133-4e70-b2dd-c61c54abdec4-kube-api-access-b6q8m\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.208835 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.225817 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.241422 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.243584 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.243650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.243674 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.243704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.243725 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.255984 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.273434 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.285702 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.320249 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.342437 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:12 crc kubenswrapper[4778]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 18 09:04:12 crc kubenswrapper[4778]: set -euo pipefail Mar 18 09:04:12 crc kubenswrapper[4778]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 18 09:04:12 crc kubenswrapper[4778]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 18 09:04:12 crc kubenswrapper[4778]: # As the secret mount is optional we must wait for the files to be present. Mar 18 09:04:12 crc kubenswrapper[4778]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 18 09:04:12 crc kubenswrapper[4778]: TS=$(date +%s) Mar 18 09:04:12 crc kubenswrapper[4778]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 18 09:04:12 crc kubenswrapper[4778]: HAS_LOGGED_INFO=0 Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: log_missing_certs(){ Mar 18 09:04:12 crc kubenswrapper[4778]: CUR_TS=$(date +%s) Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 18 09:04:12 crc kubenswrapper[4778]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 18 09:04:12 crc kubenswrapper[4778]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 18 09:04:12 crc kubenswrapper[4778]: HAS_LOGGED_INFO=1 Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: } Mar 18 09:04:12 crc kubenswrapper[4778]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 18 09:04:12 crc kubenswrapper[4778]: log_missing_certs Mar 18 09:04:12 crc kubenswrapper[4778]: sleep 5 Mar 18 09:04:12 crc kubenswrapper[4778]: done Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 18 09:04:12 crc kubenswrapper[4778]: exec /usr/bin/kube-rbac-proxy \ Mar 18 09:04:12 crc kubenswrapper[4778]: --logtostderr \ Mar 18 09:04:12 crc kubenswrapper[4778]: --secure-listen-address=:9108 \ Mar 18 09:04:12 crc kubenswrapper[4778]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 18 09:04:12 crc kubenswrapper[4778]: --upstream=http://127.0.0.1:29108/ \ Mar 18 09:04:12 crc kubenswrapper[4778]: --tls-private-key-file=${TLS_PK} \ Mar 18 09:04:12 crc kubenswrapper[4778]: --tls-cert-file=${TLS_CERT} Mar 18 09:04:12 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6q8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-7262f_openshift-ovn-kubernetes(19777429-4133-4e70-b2dd-c61c54abdec4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:12 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.351705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.351769 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.351791 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.351820 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.351843 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.361889 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 09:04:12 crc kubenswrapper[4778]: 	container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: set -o allexport
Mar 18 09:04:12 crc kubenswrapper[4778]: source "/env/_master"
Mar 18 09:04:12 crc kubenswrapper[4778]: set +o allexport
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_join_subnet_opt=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet "
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_join_subnet_opt=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet "
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_transit_switch_subnet_opt=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet "
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_transit_switch_subnet_opt=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet "
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: dns_name_resolver_enabled_flag=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "false" == "true" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver"
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: persistent_ips_enabled_flag=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "true" == "true" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: persistent_ips_enabled_flag="--enable-persistent-ips"
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: # This is needed so that converting clusters from GA to TP
Mar 18 09:04:12 crc kubenswrapper[4778]: # will rollout control plane pods as well
Mar 18 09:04:12 crc kubenswrapper[4778]: network_segmentation_enabled_flag=
Mar 18 09:04:12 crc kubenswrapper[4778]: multi_network_enabled_flag=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "true" == "true" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: multi_network_enabled_flag="--enable-multi-network"
Mar 18 09:04:12 crc kubenswrapper[4778]: network_segmentation_enabled_flag="--enable-network-segmentation"
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}"
Mar 18 09:04:12 crc kubenswrapper[4778]: exec /usr/bin/ovnkube \
Mar 18 09:04:12 crc kubenswrapper[4778]: --enable-interconnect \
Mar 18 09:04:12 crc kubenswrapper[4778]: --init-cluster-manager "${K8S_NODE}" \
Mar 18 09:04:12 crc kubenswrapper[4778]: --config-file=/run/ovnkube-config/ovnkube.conf \
Mar 18 09:04:12 crc kubenswrapper[4778]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \
Mar 18 09:04:12 crc kubenswrapper[4778]: --metrics-bind-address "127.0.0.1:29108" \
Mar 18 09:04:12 crc kubenswrapper[4778]: --metrics-enable-pprof \
Mar 18 09:04:12 crc kubenswrapper[4778]: --metrics-enable-config-duration \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v4_join_subnet_opt} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v6_join_subnet_opt} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v4_transit_switch_subnet_opt} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v6_transit_switch_subnet_opt} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${dns_name_resolver_enabled_flag} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${persistent_ips_enabled_flag} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${multi_network_enabled_flag} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${network_segmentation_enabled_flag}
Mar 18 09:04:12 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6q8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-7262f_openshift-ovn-kubernetes(19777429-4133-4e70-b2dd-c61c54abdec4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 18 09:04:12 crc kubenswrapper[4778]:  > logger="UnhandledError"
Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.362964 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" podUID="19777429-4133-4e70-b2dd-c61c54abdec4"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.459041 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.459089 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.459106 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.459129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.459145 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.562127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.562185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.562222 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.562244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.562258 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.664446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.664538 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.664549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.664565 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.664577 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.706634 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-9bc7s"]
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.707110 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.707180 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.729689 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" event={"ID":"19777429-4133-4e70-b2dd-c61c54abdec4","Type":"ContainerStarted","Data":"b77ec21dbeb1ef96b33093af249fe16903ad69bddc5ec9e4bf3972b97e6e679a"}
Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.731618 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 09:04:12 crc kubenswrapper[4778]: 	container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash
Mar 18 09:04:12 crc kubenswrapper[4778]: set -euo pipefail
Mar 18 09:04:12 crc kubenswrapper[4778]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key
Mar 18 09:04:12 crc kubenswrapper[4778]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt
Mar 18 09:04:12 crc kubenswrapper[4778]: # As the secret mount is optional we must wait for the files to be present.
Mar 18 09:04:12 crc kubenswrapper[4778]: # The service is created in monitor.yaml and this is created in sdn.yaml.
Mar 18 09:04:12 crc kubenswrapper[4778]: TS=$(date +%s)
Mar 18 09:04:12 crc kubenswrapper[4778]: WARN_TS=$(( ${TS} + $(( 20 * 60)) ))
Mar 18 09:04:12 crc kubenswrapper[4778]: HAS_LOGGED_INFO=0
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: log_missing_certs(){
Mar 18 09:04:12 crc kubenswrapper[4778]: CUR_TS=$(date +%s)
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes.
Mar 18 09:04:12 crc kubenswrapper[4778]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then
Mar 18 09:04:12 crc kubenswrapper[4778]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes.
Mar 18 09:04:12 crc kubenswrapper[4778]: HAS_LOGGED_INFO=1
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]: }
Mar 18 09:04:12 crc kubenswrapper[4778]: while [[ ! -f "${TLS_PK}" || ! -f "${TLS_CERT}" ]] ; do
Mar 18 09:04:12 crc kubenswrapper[4778]: log_missing_certs
Mar 18 09:04:12 crc kubenswrapper[4778]: sleep 5
Mar 18 09:04:12 crc kubenswrapper[4778]: done
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy
Mar 18 09:04:12 crc kubenswrapper[4778]: exec /usr/bin/kube-rbac-proxy \
Mar 18 09:04:12 crc kubenswrapper[4778]: --logtostderr \
Mar 18 09:04:12 crc kubenswrapper[4778]: --secure-listen-address=:9108 \
Mar 18 09:04:12 crc kubenswrapper[4778]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \
Mar 18 09:04:12 crc kubenswrapper[4778]: --upstream=http://127.0.0.1:29108/ \
Mar 18 09:04:12 crc kubenswrapper[4778]: --tls-private-key-file=${TLS_PK} \
Mar 18 09:04:12 crc kubenswrapper[4778]: --tls-cert-file=${TLS_CERT}
Mar 18 09:04:12 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6q8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-7262f_openshift-ovn-kubernetes(19777429-4133-4e70-b2dd-c61c54abdec4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 18 09:04:12 crc kubenswrapper[4778]:  > logger="UnhandledError"
Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.734053 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 09:04:12 crc kubenswrapper[4778]: 	container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: set -o allexport
Mar 18 09:04:12 crc kubenswrapper[4778]: source "/env/_master"
Mar 18 09:04:12 crc kubenswrapper[4778]: set +o allexport
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_join_subnet_opt=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet "
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_join_subnet_opt=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet "
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_transit_switch_subnet_opt=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet "
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_transit_switch_subnet_opt=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet "
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: dns_name_resolver_enabled_flag=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "false" == "true" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver"
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: persistent_ips_enabled_flag=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "true" == "true" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: persistent_ips_enabled_flag="--enable-persistent-ips"
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: # This is needed so that converting clusters from GA to TP
Mar 18 09:04:12 crc kubenswrapper[4778]: # will rollout control plane pods as well
Mar 18 09:04:12 crc kubenswrapper[4778]: network_segmentation_enabled_flag=
Mar 18 09:04:12 crc kubenswrapper[4778]: multi_network_enabled_flag=
Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "true" == "true" ]]; then
Mar 18 09:04:12 crc kubenswrapper[4778]: multi_network_enabled_flag="--enable-multi-network"
Mar 18 09:04:12 crc kubenswrapper[4778]: network_segmentation_enabled_flag="--enable-network-segmentation"
Mar 18 09:04:12 crc kubenswrapper[4778]: fi
Mar 18 09:04:12 crc kubenswrapper[4778]:
Mar 18 09:04:12 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}"
Mar 18 09:04:12 crc kubenswrapper[4778]: exec /usr/bin/ovnkube \
Mar 18 09:04:12 crc kubenswrapper[4778]: --enable-interconnect \
Mar 18 09:04:12 crc kubenswrapper[4778]: --init-cluster-manager "${K8S_NODE}" \
Mar 18 09:04:12 crc kubenswrapper[4778]: --config-file=/run/ovnkube-config/ovnkube.conf \
Mar 18 09:04:12 crc kubenswrapper[4778]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \
Mar 18 09:04:12 crc kubenswrapper[4778]: --metrics-bind-address "127.0.0.1:29108" \
Mar 18 09:04:12 crc kubenswrapper[4778]: --metrics-enable-pprof \
Mar 18 09:04:12 crc kubenswrapper[4778]: --metrics-enable-config-duration \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v4_join_subnet_opt} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v6_join_subnet_opt} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v4_transit_switch_subnet_opt} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v6_transit_switch_subnet_opt} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${dns_name_resolver_enabled_flag} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${persistent_ips_enabled_flag} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${multi_network_enabled_flag} \
Mar 18 09:04:12 crc kubenswrapper[4778]: ${network_segmentation_enabled_flag}
Mar 18 09:04:12 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6q8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-7262f_openshift-ovn-kubernetes(19777429-4133-4e70-b2dd-c61c54abdec4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 18 09:04:12 crc kubenswrapper[4778]:  > logger="UnhandledError"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.734690 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.735683 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" podUID="19777429-4133-4e70-b2dd-c61c54abdec4"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.748331 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.766034 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.767142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.767207 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.767218 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.767234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.767245 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.783653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.783705 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d88r\" (UniqueName: \"kubernetes.io/projected/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-kube-api-access-9d88r\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.788888 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.801266 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.812691 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.824648 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.834225 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.848399 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.859141 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.869885 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.870608 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.870663 4778 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.870687 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.870717 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.870735 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.877955 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.884982 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 
09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.885075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d88r\" (UniqueName: \"kubernetes.io/projected/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-kube-api-access-9d88r\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.885397 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.885516 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:13.385490789 +0000 UTC m=+119.960235619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.889146 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.901270 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.907079 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d88r\" (UniqueName: \"kubernetes.io/projected/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-kube-api-access-9d88r\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.920386 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.938290 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.966743 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.974163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.974238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.974252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.974272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.974286 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.983925 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.004545 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.023319 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.050796 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.077859 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.078123 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.078265 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.078394 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.078502 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.080723 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.092851 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.104414 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.125503 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.139897 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.155710 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.174501 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.181614 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.181683 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.181703 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.181761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.181781 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.189008 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:13 crc kubenswrapper[4778]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 09:04:13 crc kubenswrapper[4778]: set -uo pipefail Mar 18 09:04:13 crc kubenswrapper[4778]: Mar 18 09:04:13 crc kubenswrapper[4778]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 09:04:13 crc kubenswrapper[4778]: Mar 18 09:04:13 crc kubenswrapper[4778]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 09:04:13 crc kubenswrapper[4778]: HOSTS_FILE="/etc/hosts" Mar 18 09:04:13 crc kubenswrapper[4778]: TEMP_FILE="/etc/hosts.tmp" Mar 18 09:04:13 crc kubenswrapper[4778]: Mar 18 09:04:13 crc kubenswrapper[4778]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 09:04:13 crc kubenswrapper[4778]: Mar 18 09:04:13 crc kubenswrapper[4778]: # Make a temporary file with the old hosts file's attributes. Mar 18 09:04:13 crc kubenswrapper[4778]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 09:04:13 crc kubenswrapper[4778]: echo "Failed to preserve hosts file. Exiting." 
Mar 18 09:04:13 crc kubenswrapper[4778]: exit 1 Mar 18 09:04:13 crc kubenswrapper[4778]: fi Mar 18 09:04:13 crc kubenswrapper[4778]: Mar 18 09:04:13 crc kubenswrapper[4778]: while true; do Mar 18 09:04:13 crc kubenswrapper[4778]: declare -A svc_ips Mar 18 09:04:13 crc kubenswrapper[4778]: for svc in "${services[@]}"; do Mar 18 09:04:13 crc kubenswrapper[4778]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 09:04:13 crc kubenswrapper[4778]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 18 09:04:13 crc kubenswrapper[4778]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 09:04:13 crc kubenswrapper[4778]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 18 09:04:13 crc kubenswrapper[4778]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:04:13 crc kubenswrapper[4778]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:04:13 crc kubenswrapper[4778]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:04:13 crc kubenswrapper[4778]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 09:04:13 crc kubenswrapper[4778]: for i in ${!cmds[*]} Mar 18 09:04:13 crc kubenswrapper[4778]: do Mar 18 09:04:13 crc kubenswrapper[4778]: ips=($(eval "${cmds[i]}")) Mar 18 09:04:13 crc kubenswrapper[4778]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 09:04:13 crc kubenswrapper[4778]: svc_ips["${svc}"]="${ips[@]}" Mar 18 09:04:13 crc kubenswrapper[4778]: break Mar 18 09:04:13 crc kubenswrapper[4778]: fi Mar 18 09:04:13 crc kubenswrapper[4778]: done Mar 18 09:04:13 crc kubenswrapper[4778]: done Mar 18 09:04:13 crc kubenswrapper[4778]: Mar 18 09:04:13 crc kubenswrapper[4778]: # Update /etc/hosts only if we get valid service IPs Mar 18 09:04:13 crc kubenswrapper[4778]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 09:04:13 crc kubenswrapper[4778]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 09:04:13 crc kubenswrapper[4778]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 09:04:13 crc kubenswrapper[4778]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 09:04:13 crc kubenswrapper[4778]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 09:04:13 crc kubenswrapper[4778]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 09:04:13 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:04:13 crc kubenswrapper[4778]: continue Mar 18 09:04:13 crc kubenswrapper[4778]: fi Mar 18 09:04:13 crc kubenswrapper[4778]: Mar 18 09:04:13 crc kubenswrapper[4778]: # Append resolver entries for services Mar 18 09:04:13 crc kubenswrapper[4778]: rc=0 Mar 18 09:04:13 crc kubenswrapper[4778]: for svc in "${!svc_ips[@]}"; do Mar 18 09:04:13 crc kubenswrapper[4778]: for ip in ${svc_ips[${svc}]}; do Mar 18 09:04:13 crc kubenswrapper[4778]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 18 09:04:13 crc kubenswrapper[4778]: done Mar 18 09:04:13 crc kubenswrapper[4778]: done Mar 18 09:04:13 crc kubenswrapper[4778]: if [[ $rc -ne 0 ]]; then Mar 18 09:04:13 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:04:13 crc kubenswrapper[4778]: continue Mar 18 09:04:13 crc kubenswrapper[4778]: fi Mar 18 09:04:13 crc kubenswrapper[4778]: Mar 18 09:04:13 crc kubenswrapper[4778]: Mar 18 09:04:13 crc kubenswrapper[4778]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 09:04:13 crc kubenswrapper[4778]: # Replace /etc/hosts with our modified version if needed Mar 18 09:04:13 crc kubenswrapper[4778]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 09:04:13 crc kubenswrapper[4778]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 09:04:13 crc kubenswrapper[4778]: fi Mar 18 09:04:13 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:04:13 crc kubenswrapper[4778]: unset svc_ips Mar 18 09:04:13 crc kubenswrapper[4778]: done Mar 18 09:04:13 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f9b2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-dfnnp_openshift-dns(8cf64307-e191-476a-902b-93001adc0b16): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:13 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.189135 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:13 crc kubenswrapper[4778]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 09:04:13 crc kubenswrapper[4778]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 09:04:13 crc kubenswrapper[4778]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2zsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-r2lvf_openshift-multus(dce973f3-25e6-4536-87cc-9b46499ad7cf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:13 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.190699 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-dfnnp" podUID="8cf64307-e191-476a-902b-93001adc0b16" Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.190779 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-r2lvf" podUID="dce973f3-25e6-4536-87cc-9b46499ad7cf" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.193014 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.208155 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.227472 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.241418 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.255977 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.269248 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.284934 4778 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.284987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.285001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.285021 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.285033 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.389288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.389364 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.389381 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.389408 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.389431 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.392034 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.392436 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.392823 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:14.392793518 +0000 UTC m=+120.967538398 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.493084 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.493135 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.493153 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.493177 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.493237 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.536570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.536636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.536656 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.536686 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.536705 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.554015 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.559552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.559626 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.559646 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.559673 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.559692 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.578075 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.583667 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.583889 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.584024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.584286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.584533 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.601615 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.607914 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.607997 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.608017 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.608440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.608473 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.624006 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.629018 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.629069 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.629087 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.629110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.629127 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.646082 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.646364 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.648820 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.648900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.648924 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.648956 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.648982 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.752613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.752678 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.752690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.752715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.752735 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.855056 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.855115 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.855133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.855155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.855171 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.958563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.958604 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.958614 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.958628 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.958637 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.061963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.062029 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.062045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.062104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.062122 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:14Z","lastTransitionTime":"2026-03-18T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.162708 4778 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.186701 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.186888 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.187079 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.187281 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.187393 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.187632 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.187758 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.188227 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.190634 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:14 crc kubenswrapper[4778]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 09:04:14 crc kubenswrapper[4778]: apiVersion: v1 Mar 18 09:04:14 crc kubenswrapper[4778]: clusters: Mar 18 09:04:14 crc kubenswrapper[4778]: - cluster: Mar 18 09:04:14 crc kubenswrapper[4778]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 09:04:14 crc kubenswrapper[4778]: server: https://api-int.crc.testing:6443 Mar 18 09:04:14 crc kubenswrapper[4778]: name: default-cluster Mar 18 09:04:14 crc kubenswrapper[4778]: contexts: Mar 18 09:04:14 crc kubenswrapper[4778]: - context: Mar 18 09:04:14 crc kubenswrapper[4778]: cluster: default-cluster Mar 18 09:04:14 crc kubenswrapper[4778]: namespace: default Mar 18 09:04:14 crc kubenswrapper[4778]: user: default-auth Mar 18 09:04:14 crc kubenswrapper[4778]: name: default-context Mar 18 09:04:14 crc kubenswrapper[4778]: current-context: default-context Mar 18 09:04:14 crc kubenswrapper[4778]: kind: Config Mar 18 09:04:14 crc kubenswrapper[4778]: preferences: {} Mar 18 09:04:14 crc kubenswrapper[4778]: users: Mar 18 09:04:14 crc kubenswrapper[4778]: - 
name: default-auth Mar 18 09:04:14 crc kubenswrapper[4778]: user: Mar 18 09:04:14 crc kubenswrapper[4778]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 09:04:14 crc kubenswrapper[4778]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 09:04:14 crc kubenswrapper[4778]: EOF Mar 18 09:04:14 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b8g6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:14 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.191738 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.204473 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.219728 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.233396 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.259040 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.277389 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.283969 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.285885 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.294693 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.313752 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.327789 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.338903 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.352866 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.365257 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.380822 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.393782 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.402818 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.404165 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.404307 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.404370 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. 
No retries permitted until 2026-03-18 09:04:16.404352053 +0000 UTC m=+122.979096893 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.413382 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.423569 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:15 crc kubenswrapper[4778]: E0318 09:04:15.189220 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init 
container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cxhf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-xkfx8_openshift-multus(b1698c21-24a7-4338-a0ad-dd110c1ba2f2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:15 crc kubenswrapper[4778]: E0318 09:04:15.189795 4778 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjjcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:15 crc kubenswrapper[4778]: E0318 09:04:15.190806 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" podUID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" Mar 18 09:04:15 crc kubenswrapper[4778]: E0318 09:04:15.192731 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjjcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:15 crc kubenswrapper[4778]: E0318 09:04:15.194017 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:04:16 crc kubenswrapper[4778]: I0318 09:04:16.186321 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:16 crc kubenswrapper[4778]: I0318 09:04:16.186362 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:16 crc kubenswrapper[4778]: E0318 09:04:16.186509 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:16 crc kubenswrapper[4778]: E0318 09:04:16.186891 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:16 crc kubenswrapper[4778]: I0318 09:04:16.187349 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:16 crc kubenswrapper[4778]: I0318 09:04:16.187412 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:16 crc kubenswrapper[4778]: E0318 09:04:16.187773 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:16 crc kubenswrapper[4778]: E0318 09:04:16.187614 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:16 crc kubenswrapper[4778]: I0318 09:04:16.430737 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:16 crc kubenswrapper[4778]: E0318 09:04:16.430931 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:16 crc kubenswrapper[4778]: E0318 09:04:16.431094 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:20.431052308 +0000 UTC m=+127.005797188 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:17 crc kubenswrapper[4778]: I0318 09:04:17.913570 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:04:17 crc kubenswrapper[4778]: I0318 09:04:17.932959 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:17 crc kubenswrapper[4778]: I0318 09:04:17.948300 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:17 crc kubenswrapper[4778]: I0318 09:04:17.968581 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:17 crc kubenswrapper[4778]: I0318 09:04:17.989275 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.005426 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.026119 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.039497 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.053426 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.066100 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.078854 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.101059 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.121150 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c
57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.134984 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.147123 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.162518 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.173899 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.184929 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.186235 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.186260 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:18 crc kubenswrapper[4778]: E0318 09:04:18.186478 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.186508 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:18 crc kubenswrapper[4778]: E0318 09:04:18.186543 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.186278 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:18 crc kubenswrapper[4778]: E0318 09:04:18.186673 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:18 crc kubenswrapper[4778]: E0318 09:04:18.186839 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:19 crc kubenswrapper[4778]: E0318 09:04:19.285815 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:04:20 crc kubenswrapper[4778]: I0318 09:04:20.187000 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:20 crc kubenswrapper[4778]: I0318 09:04:20.187130 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:20 crc kubenswrapper[4778]: I0318 09:04:20.187292 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.187350 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:20 crc kubenswrapper[4778]: I0318 09:04:20.187476 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.187699 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.188056 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.188112 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.189032 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:20 crc kubenswrapper[4778]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 18 09:04:20 crc kubenswrapper[4778]: while [ true ]; Mar 18 09:04:20 crc kubenswrapper[4778]: do Mar 18 09:04:20 crc kubenswrapper[4778]: for f in $(ls /tmp/serviceca); do Mar 18 09:04:20 crc kubenswrapper[4778]: echo $f Mar 18 09:04:20 crc kubenswrapper[4778]: ca_file_path="/tmp/serviceca/${f}" Mar 18 09:04:20 crc kubenswrapper[4778]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 18 09:04:20 crc kubenswrapper[4778]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 18 09:04:20 crc kubenswrapper[4778]: if [ -e "${reg_dir_path}" ]; then Mar 18 09:04:20 crc kubenswrapper[4778]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 18 09:04:20 crc kubenswrapper[4778]: else Mar 18 09:04:20 crc kubenswrapper[4778]: mkdir $reg_dir_path Mar 18 09:04:20 crc kubenswrapper[4778]: cp $ca_file_path $reg_dir_path/ca.crt Mar 18 09:04:20 crc kubenswrapper[4778]: fi Mar 18 09:04:20 crc kubenswrapper[4778]: done Mar 18 09:04:20 crc kubenswrapper[4778]: for d in $(ls /etc/docker/certs.d); do Mar 18 09:04:20 crc kubenswrapper[4778]: echo $d Mar 18 09:04:20 crc kubenswrapper[4778]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 18 09:04:20 crc kubenswrapper[4778]: reg_conf_path="/tmp/serviceca/${dp}" Mar 18 09:04:20 crc kubenswrapper[4778]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 18 09:04:20 crc kubenswrapper[4778]: rm -rf /etc/docker/certs.d/$d Mar 18 09:04:20 crc kubenswrapper[4778]: fi Mar 18 09:04:20 crc kubenswrapper[4778]: done Mar 18 09:04:20 crc kubenswrapper[4778]: sleep 60 & wait ${!} Mar 18 09:04:20 crc kubenswrapper[4778]: done Mar 18 09:04:20 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grrhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-9f2bp_openshift-image-registry(69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:20 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.190239 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-9f2bp" podUID="69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.190524 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:20 crc kubenswrapper[4778]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 09:04:20 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:20 crc kubenswrapper[4778]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 09:04:20 crc kubenswrapper[4778]: source /etc/kubernetes/apiserver-url.env Mar 18 09:04:20 crc kubenswrapper[4778]: else Mar 18 09:04:20 crc kubenswrapper[4778]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 09:04:20 crc kubenswrapper[4778]: exit 1 Mar 18 09:04:20 crc kubenswrapper[4778]: fi Mar 18 09:04:20 crc kubenswrapper[4778]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 09:04:20 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:20 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.190908 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.191701 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.192306 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 09:04:20 crc kubenswrapper[4778]: I0318 09:04:20.475861 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.475901 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.476137 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:28.476115577 +0000 UTC m=+135.050860657 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:22 crc kubenswrapper[4778]: I0318 09:04:22.186474 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:22 crc kubenswrapper[4778]: I0318 09:04:22.186581 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:22 crc kubenswrapper[4778]: I0318 09:04:22.186590 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:22 crc kubenswrapper[4778]: I0318 09:04:22.186729 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.186739 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.186861 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.187341 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.187490 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.189727 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:22 crc kubenswrapper[4778]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:04:22 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:04:22 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:22 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:04:22 crc kubenswrapper[4778]: set +o allexport Mar 18 09:04:22 crc kubenswrapper[4778]: fi Mar 18 09:04:22 crc kubenswrapper[4778]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 09:04:22 crc kubenswrapper[4778]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 09:04:22 crc kubenswrapper[4778]: ho_enable="--enable-hybrid-overlay" Mar 18 09:04:22 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 09:04:22 crc kubenswrapper[4778]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 09:04:22 crc kubenswrapper[4778]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 09:04:22 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:04:22 crc kubenswrapper[4778]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 09:04:22 crc kubenswrapper[4778]: --webhook-host=127.0.0.1 \ Mar 18 09:04:22 crc kubenswrapper[4778]: --webhook-port=9743 \ Mar 18 09:04:22 crc kubenswrapper[4778]: ${ho_enable} \ Mar 18 09:04:22 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:04:22 crc kubenswrapper[4778]: --disable-approver \ Mar 18 09:04:22 crc kubenswrapper[4778]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 09:04:22 crc kubenswrapper[4778]: --wait-for-kubernetes-api=200s \ Mar 18 09:04:22 crc kubenswrapper[4778]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 09:04:22 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:04:22 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:22 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.192989 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:22 crc kubenswrapper[4778]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:04:22 crc 
kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:04:22 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:22 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:04:22 crc kubenswrapper[4778]: set +o allexport Mar 18 09:04:22 crc kubenswrapper[4778]: fi Mar 18 09:04:22 crc kubenswrapper[4778]: Mar 18 09:04:22 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 09:04:22 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:04:22 crc kubenswrapper[4778]: --disable-webhook \ Mar 18 09:04:22 crc kubenswrapper[4778]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 09:04:22 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:04:22 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:22 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.194376 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.865149 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.865241 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.865265 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.865295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.865317 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:23Z","lastTransitionTime":"2026-03-18T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:23 crc kubenswrapper[4778]: E0318 09:04:23.881779 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.887837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.887895 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.887912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.887936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.887956 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:23Z","lastTransitionTime":"2026-03-18T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:23 crc kubenswrapper[4778]: E0318 09:04:23.907840 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.908156 4778 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.912982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.913039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.913059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.913085 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.913105 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:23Z","lastTransitionTime":"2026-03-18T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:23 crc kubenswrapper[4778]: E0318 09:04:23.930679 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.935507 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.935562 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.935580 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.935604 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.935625 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:23Z","lastTransitionTime":"2026-03-18T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:23 crc kubenswrapper[4778]: E0318 09:04:23.951038 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.956271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.956334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.956348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.956366 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.956378 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:23Z","lastTransitionTime":"2026-03-18T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:23 crc kubenswrapper[4778]: E0318 09:04:23.970951 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:23 crc kubenswrapper[4778]: E0318 09:04:23.971343 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.187090 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.187148 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.187303 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:24 crc kubenswrapper[4778]: E0318 09:04:24.187505 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:24 crc kubenswrapper[4778]: E0318 09:04:24.187745 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.187798 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:24 crc kubenswrapper[4778]: E0318 09:04:24.187876 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:24 crc kubenswrapper[4778]: E0318 09:04:24.188146 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.202754 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.219173 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.238382 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.266101 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: E0318 09:04:24.286675 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.289228 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.320814 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.337603 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.347150 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.362234 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.372956 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.411111 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.442442 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.459886 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.474001 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.488020 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.500796 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.508189 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.186496 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.186566 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.186495 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:26 crc kubenswrapper[4778]: E0318 09:04:26.186742 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.186812 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:26 crc kubenswrapper[4778]: E0318 09:04:26.187481 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:26 crc kubenswrapper[4778]: E0318 09:04:26.187628 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:26 crc kubenswrapper[4778]: E0318 09:04:26.187752 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.781749 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dfnnp" event={"ID":"8cf64307-e191-476a-902b-93001adc0b16","Type":"ContainerStarted","Data":"f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627"} Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.783958 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerStarted","Data":"27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5"} Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.797937 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.809791 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.819111 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.829461 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.848750 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.871498 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.897762 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c
57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.914138 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.922327 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.937022 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.947285 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.956871 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.971603 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.984251 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.000088 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.011620 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.024307 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.036331 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.051398 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.064576 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.075857 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.091263 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.102264 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.111987 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.127261 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.143442 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.164088 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.176748 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.212683 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.230516 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.240488 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.253409 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.262249 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.271545 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.789267 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88"} Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.789331 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc"} Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.793174 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" event={"ID":"19777429-4133-4e70-b2dd-c61c54abdec4","Type":"ContainerStarted","Data":"33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee"} Mar 18 09:04:27 crc 
kubenswrapper[4778]: I0318 09:04:27.793229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" event={"ID":"19777429-4133-4e70-b2dd-c61c54abdec4","Type":"ContainerStarted","Data":"b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888"} Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.795349 4778 generic.go:334] "Generic (PLEG): container finished" podID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" containerID="b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8" exitCode=0 Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.795381 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerDied","Data":"b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8"} Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.799785 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.806973 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.816951 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.826670 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.844680 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.856643 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.885412 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.894851 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.904917 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.916803 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.924174 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.932429 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.939895 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.951355 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.963412 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.976175 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.989596 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.002959 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.012924 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.030514 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.042128 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.053348 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.062138 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.069622 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.085036 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.096089 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.114435 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.133516 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.153724 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c
57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.165343 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.172671 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.185561 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.186946 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.186970 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.187014 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.187091 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:28 crc kubenswrapper[4778]: E0318 09:04:28.187092 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:28 crc kubenswrapper[4778]: E0318 09:04:28.187217 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:28 crc kubenswrapper[4778]: E0318 09:04:28.187289 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:28 crc kubenswrapper[4778]: E0318 09:04:28.187669 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.200705 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.212681 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.566342 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:28 crc kubenswrapper[4778]: E0318 09:04:28.566609 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:28 crc kubenswrapper[4778]: E0318 09:04:28.566894 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:44.566870431 +0000 UTC m=+151.141615271 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.802813 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" exitCode=0 Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.802963 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107"} Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.810422 4778 generic.go:334] "Generic (PLEG): container finished" podID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" containerID="3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d" exitCode=0 Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.810485 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerDied","Data":"3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d"} Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.824365 4778 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.840992 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.857919 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.872889 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.884699 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.896416 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.906137 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.923642 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.935743 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.947938 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.960429 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.970260 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.984892 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.995901 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.006749 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.027958 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.041753 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.060791 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.078001 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.089434 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.103220 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.115252 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.126517 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.137852 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.151771 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.165704 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.177649 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.201679 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.214006 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.226235 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.248034 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.264820 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: E0318 09:04:29.289889 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.291699 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.308055 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.820155 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"} Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.820541 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"} Mar 18 
09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.820560 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"} Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.824831 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerDied","Data":"7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917"} Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.824834 4778 generic.go:334] "Generic (PLEG): container finished" podID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" containerID="7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917" exitCode=0 Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.843697 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.860677 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.874903 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.888108 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.901733 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.914085 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.925381 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.936508 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.949621 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.965556 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.976383 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.993834 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.005698 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.015143 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.027962 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.042042 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.053751 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.186829 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.186887 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:30 crc kubenswrapper[4778]: E0318 09:04:30.187009 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.187478 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.187604 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:30 crc kubenswrapper[4778]: E0318 09:04:30.187729 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:30 crc kubenswrapper[4778]: E0318 09:04:30.187824 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:30 crc kubenswrapper[4778]: E0318 09:04:30.187908 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.833123 4778 generic.go:334] "Generic (PLEG): container finished" podID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" containerID="763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10" exitCode=0 Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.833235 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerDied","Data":"763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10"} Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.839356 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"} Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.839537 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"} Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.839605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" 
event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"} Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.854339 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723
269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.867761 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.877516 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.891097 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.906158 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.916892 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.931045 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.942641 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.958696 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.971601 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.982623 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.992387 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.002283 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.013163 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.024653 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.035719 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.055685 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.852999 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5"} Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.858614 4778 generic.go:334] "Generic (PLEG): container finished" podID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" containerID="76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152" exitCode=0 Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.858686 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerDied","Data":"76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152"} Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.868295 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.878736 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.896802 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.905057 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.914499 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.931797 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.942986 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126b
d791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.957603 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.967120 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.973786 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.984702 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.995465 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.004932 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.019185 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.034379 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.049541 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.059053 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.069450 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.095678 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.108320 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.115428 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.127338 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.134386 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.144966 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.157160 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.167701 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.181070 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.186113 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.186273 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.186324 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:32 crc kubenswrapper[4778]: E0318 09:04:32.186429 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.186835 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:32 crc kubenswrapper[4778]: E0318 09:04:32.186902 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:32 crc kubenswrapper[4778]: E0318 09:04:32.186983 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:32 crc kubenswrapper[4778]: E0318 09:04:32.187067 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.191947 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea8
3e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.200863 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.208810 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.218799 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.228053 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.237455 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.261602 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.868058 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"} Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.874640 4778 generic.go:334] "Generic (PLEG): container finished" podID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" containerID="b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792" exitCode=0 Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.874673 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerDied","Data":"b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792"} Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.886155 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.898350 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.911338 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.936184 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.960094 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.971697 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.979078 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.989535 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.000429 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.009279 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.033663 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.060594 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.095608 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.114854 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.131644 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.144697 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.162701 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.883850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerStarted","Data":"bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb"} Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.888008 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc"} Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.888040 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18"} Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.907376 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.919155 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.937908 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.958947 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugin
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2dae
d8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedA
t\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\
",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.971490 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.985386 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.998130 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.022597 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.036882 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.037217 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:05:38.03715355 +0000 UTC m=+204.611898390 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.046383 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.065635 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.092998 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.110587 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc 
kubenswrapper[4778]: I0318 09:04:34.129301 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.138853 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.138911 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.138958 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.138985 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139076 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:05:38.139050201 +0000 UTC m=+204.713795071 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.138996 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139105 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139303 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139358 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139382 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139317 4778 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139460 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139482 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139331 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:05:38.139302888 +0000 UTC m=+204.714047768 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139553 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:05:38.139538304 +0000 UTC m=+204.714283184 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139593 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:05:38.139577365 +0000 UTC m=+204.714322355 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.146119 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.168328 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.187276 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.187384 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.187307 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.187508 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.187625 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.187779 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.187910 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.188104 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.199921 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.214906 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.229566 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.245851 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.270455 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.291493 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.305625 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.309479 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.309524 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.309543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.309563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.309577 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:34Z","lastTransitionTime":"2026-03-18T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.329100 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.334726 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.334762 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.334774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.334790 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.334802 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:34Z","lastTransitionTime":"2026-03-18T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.341717 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.352514 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.357327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.357362 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.357371 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.357385 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.357394 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:34Z","lastTransitionTime":"2026-03-18T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.363332 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.374877 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.378333 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.379963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.380027 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.380045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.380071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.380089 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:34Z","lastTransitionTime":"2026-03-18T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.395380 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.399705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.399732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.399740 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.399755 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.399806 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:34Z","lastTransitionTime":"2026-03-18T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.403872 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.414742 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.414911 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.416033 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.430660 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.452416 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.472091 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.491956 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.513547 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.528011 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.542743 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.556506 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc 
kubenswrapper[4778]: I0318 09:04:34.570945 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.582078 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc 
kubenswrapper[4778]: I0318 09:04:34.598116 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.614628 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.629993 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.659798 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.690018 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.703498 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.715923 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.732800 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.743897 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.755844 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.770187 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.782726 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.797177 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.810952 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.824480 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.898332 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b"} Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.898636 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 
09:04:34.917628 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.931792 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.950176 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.956471 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 
09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.979322 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.996753 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.018495 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.033472 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.048539 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.067277 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.081920 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.098921 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":
\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.113890 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.129907 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.143580 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.156057 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.169056 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.181353 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc 
kubenswrapper[4778]: I0318 09:04:35.197526 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e
87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.222966 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.239113 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.254820 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.268376 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.284597 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.298275 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc 
kubenswrapper[4778]: I0318 09:04:35.314498 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.355905 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.395473 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.439252 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.483282 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.513982 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.555786 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.622873 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.661810 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.675960 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.904145 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64"} Mar 18 09:04:35 crc 
kubenswrapper[4778]: I0318 09:04:35.906663 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9f2bp" event={"ID":"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7","Type":"ContainerStarted","Data":"cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7"} Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.907514 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.907568 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.928016 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.938756 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.953294 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.964422 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.978975 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.991644 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.005016 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.022773 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.043773 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.068769 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.086912 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.119175 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.156043 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.186624 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.186675 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.186711 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:36 crc kubenswrapper[4778]: E0318 09:04:36.186766 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.186922 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:36 crc kubenswrapper[4778]: E0318 09:04:36.186915 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:36 crc kubenswrapper[4778]: E0318 09:04:36.187082 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:36 crc kubenswrapper[4778]: E0318 09:04:36.187188 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.194534 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc 
kubenswrapper[4778]: I0318 09:04:36.238050 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.275469 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.320116 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.400425 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.411287 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.443026 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.479681 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.514093 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.564055 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.592960 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.635598 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.672797 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.716495 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.755287 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.793335 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.838547 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.875308 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc 
kubenswrapper[4778]: I0318 09:04:36.914869 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.952282 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.997063 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.042046 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:37Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.921921 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/0.log" Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.926640 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b" exitCode=1 Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.926709 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b"} Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.927360 4778 scope.go:117] "RemoveContainer" containerID="49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b" Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.951526 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:37Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.968826 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:37Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.981708 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:37Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.001356 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:37Z\\\",\\\"message\\\":\\\"78 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:37.028846 6778 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 09:04:37.029891 6778 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 
09:04:37.029908 6778 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:37.028856 6778 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:37.020307 6778 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 09:04:37.030813 6778 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:37.030837 6778 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:37.030884 6778 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:37.030942 6778 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:37.030971 6778 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:37.030942 6778 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:37.031009 6778 factory.go:656] Stopping watch factory\\\\nI0318 09:04:37.031025 6778 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:37.031023 6778 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 
09:04:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3
a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:37Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.020859 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.039090 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.056675 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.074761 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.088539 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.104876 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.122344 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.136585 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.158639 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.177422 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.187101 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.187114 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.187158 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:38 crc kubenswrapper[4778]: E0318 09:04:38.187277 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.187437 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:38 crc kubenswrapper[4778]: E0318 09:04:38.187423 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:38 crc kubenswrapper[4778]: E0318 09:04:38.187542 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:38 crc kubenswrapper[4778]: E0318 09:04:38.187674 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.191418 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea8
3e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.207098 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.225746 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc 
kubenswrapper[4778]: I0318 09:04:38.934630 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/0.log" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.939555 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b"} Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.940379 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.962894 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.986261 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76327
8175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.001265 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.015697 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.036766 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.051988 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.068311 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.082185 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.094938 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.107047 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.119380 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.132340 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.143904 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc 
kubenswrapper[4778]: I0318 09:04:39.166921 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:37Z\\\",\\\"message\\\":\\\"78 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:37.028846 6778 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 09:04:37.029891 6778 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 09:04:37.029908 6778 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:37.028856 6778 handler.go:208] 
Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:37.020307 6778 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 09:04:37.030813 6778 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:37.030837 6778 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:37.030884 6778 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:37.030942 6778 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:37.030971 6778 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:37.030942 6778 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:37.031009 6778 factory.go:656] Stopping watch factory\\\\nI0318 09:04:37.031025 6778 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:37.031023 6778 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 
09:04:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.179841 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.191479 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.204539 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: E0318 09:04:39.293607 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.946803 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/1.log" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.948089 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/0.log" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.953704 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b" exitCode=1 Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.953785 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b"} Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.953848 4778 scope.go:117] "RemoveContainer" containerID="49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.955572 4778 scope.go:117] "RemoveContainer" containerID="7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b" Mar 18 09:04:39 crc kubenswrapper[4778]: E0318 09:04:39.955865 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:39 crc 
kubenswrapper[4778]: I0318 09:04:39.972357 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.988382 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.011429 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.027380 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.046091 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.081810 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.098694 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.121566 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.141290 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.159813 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.176792 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.186121 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.186185 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:40 crc kubenswrapper[4778]: E0318 09:04:40.186280 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.186121 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:40 crc kubenswrapper[4778]: E0318 09:04:40.186380 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:40 crc kubenswrapper[4778]: E0318 09:04:40.186463 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.186598 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:40 crc kubenswrapper[4778]: E0318 09:04:40.186668 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.191304 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.204743 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc 
kubenswrapper[4778]: I0318 09:04:40.219767 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc 
kubenswrapper[4778]: I0318 09:04:40.261031 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:37Z\\\",\\\"message\\\":\\\"78 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:37.028846 6778 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 09:04:37.029891 6778 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 09:04:37.029908 6778 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:37.028856 6778 handler.go:208] 
Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:37.020307 6778 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 09:04:37.030813 6778 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:37.030837 6778 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:37.030884 6778 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:37.030942 6778 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:37.030971 6778 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:37.030942 6778 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:37.031009 6778 factory.go:656] Stopping watch factory\\\\nI0318 09:04:37.031025 6778 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:37.031023 6778 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:38Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0318 09:04:38.953123 6929 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 09:04:38.953789 6929 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:38.953820 6929 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:38.953837 6929 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 09:04:38.955533 6929 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:38.955636 6929 handler.go:190] Sending *v1.Node event handler 7 
for removal\\\\nI0318 09:04:38.955712 6929 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:38.955811 6929 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:38.955825 6929 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:38.955855 6929 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:38.955888 6929 factory.go:656] Stopping watch factory\\\\nI0318 09:04:38.955909 6929 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:38.955921 6929 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:38.956003 6929 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:38.956027 6929 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 09:04:38.956134 6929 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath
\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.279925 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.299982 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.960135 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/1.log" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.963397 4778 scope.go:117] "RemoveContainer" containerID="7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b" Mar 18 09:04:40 
crc kubenswrapper[4778]: E0318 09:04:40.963544 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.978501 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.993748 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.008696 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.023527 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.040251 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.052670 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc 
kubenswrapper[4778]: I0318 09:04:41.064754 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.077916 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.096731 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.117752 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:38Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0318 09:04:38.953123 6929 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 09:04:38.953789 6929 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:38.953820 6929 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:38.953837 6929 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI0318 09:04:38.955533 6929 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:38.955636 6929 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:38.955712 6929 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:38.955811 6929 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:38.955825 6929 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:38.955855 6929 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:38.955888 6929 factory.go:656] Stopping watch factory\\\\nI0318 09:04:38.955909 6929 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:38.955921 6929 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:38.956003 6929 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:38.956027 6929 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 09:04:38.956134 6929 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.130574 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.158643 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.175598 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.188103 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.207685 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.220120 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.233232 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:42 crc kubenswrapper[4778]: I0318 09:04:42.186270 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:42 crc kubenswrapper[4778]: I0318 09:04:42.186406 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:42 crc kubenswrapper[4778]: E0318 09:04:42.187008 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:42 crc kubenswrapper[4778]: I0318 09:04:42.186441 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:42 crc kubenswrapper[4778]: I0318 09:04:42.186412 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:42 crc kubenswrapper[4778]: E0318 09:04:42.187131 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:42 crc kubenswrapper[4778]: E0318 09:04:42.187276 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:42 crc kubenswrapper[4778]: E0318 09:04:42.187465 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:43 crc kubenswrapper[4778]: I0318 09:04:43.204007 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.186781 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.186852 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.186918 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.186944 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.186985 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.187141 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.187559 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.187897 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.207771 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.226807 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.245737 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.276425 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:38Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0318 09:04:38.953123 6929 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 09:04:38.953789 6929 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:38.953820 6929 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:38.953837 6929 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI0318 09:04:38.955533 6929 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:38.955636 6929 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:38.955712 6929 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:38.955811 6929 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:38.955825 6929 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:38.955855 6929 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:38.955888 6929 factory.go:656] Stopping watch factory\\\\nI0318 09:04:38.955909 6929 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:38.955921 6929 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:38.956003 6929 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:38.956027 6929 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 09:04:38.956134 6929 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.291593 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.294184 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.326395 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.343613 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.355243 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.368014 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.381602 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.399038 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.418715 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.435027 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.452706 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.469719 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.485442 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.498888 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.510545 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc 
kubenswrapper[4778]: I0318 09:04:44.579820 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.580023 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.580142 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:05:16.580108806 +0000 UTC m=+183.154853686 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.812702 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.812815 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.812833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.813394 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.813565 4778 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:44Z","lastTransitionTime":"2026-03-18T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.834590 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.839643 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.839698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.839720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.839751 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.839773 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:44Z","lastTransitionTime":"2026-03-18T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.861341 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.866510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.866567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.866589 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.866621 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.866644 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:44Z","lastTransitionTime":"2026-03-18T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.893802 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.900241 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.900295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.900314 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.900340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.900357 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:44Z","lastTransitionTime":"2026-03-18T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.915289 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.920384 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.920453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.920482 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.920515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.920537 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:44Z","lastTransitionTime":"2026-03-18T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.937595 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.937743 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:04:46 crc kubenswrapper[4778]: I0318 09:04:46.186387 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:46 crc kubenswrapper[4778]: I0318 09:04:46.186488 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:46 crc kubenswrapper[4778]: I0318 09:04:46.186491 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:46 crc kubenswrapper[4778]: E0318 09:04:46.186575 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:46 crc kubenswrapper[4778]: E0318 09:04:46.186738 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:46 crc kubenswrapper[4778]: I0318 09:04:46.186828 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:46 crc kubenswrapper[4778]: E0318 09:04:46.186929 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:46 crc kubenswrapper[4778]: E0318 09:04:46.187014 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:48 crc kubenswrapper[4778]: I0318 09:04:48.186170 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:48 crc kubenswrapper[4778]: I0318 09:04:48.186251 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:48 crc kubenswrapper[4778]: I0318 09:04:48.186311 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:48 crc kubenswrapper[4778]: I0318 09:04:48.186335 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:48 crc kubenswrapper[4778]: E0318 09:04:48.187546 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:48 crc kubenswrapper[4778]: E0318 09:04:48.187675 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:48 crc kubenswrapper[4778]: E0318 09:04:48.187843 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:48 crc kubenswrapper[4778]: E0318 09:04:48.187995 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:49 crc kubenswrapper[4778]: I0318 09:04:49.201522 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 09:04:49 crc kubenswrapper[4778]: E0318 09:04:49.295769 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:04:50 crc kubenswrapper[4778]: I0318 09:04:50.186929 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:50 crc kubenswrapper[4778]: I0318 09:04:50.187067 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:50 crc kubenswrapper[4778]: I0318 09:04:50.187088 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:50 crc kubenswrapper[4778]: E0318 09:04:50.187297 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:50 crc kubenswrapper[4778]: I0318 09:04:50.187334 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:50 crc kubenswrapper[4778]: E0318 09:04:50.187463 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:50 crc kubenswrapper[4778]: E0318 09:04:50.187603 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:50 crc kubenswrapper[4778]: E0318 09:04:50.187714 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:52 crc kubenswrapper[4778]: I0318 09:04:52.186934 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:52 crc kubenswrapper[4778]: I0318 09:04:52.187005 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:52 crc kubenswrapper[4778]: E0318 09:04:52.187170 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:52 crc kubenswrapper[4778]: I0318 09:04:52.187005 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:52 crc kubenswrapper[4778]: E0318 09:04:52.187364 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:52 crc kubenswrapper[4778]: E0318 09:04:52.187245 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:52 crc kubenswrapper[4778]: I0318 09:04:52.186998 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:52 crc kubenswrapper[4778]: E0318 09:04:52.187479 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.186106 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.186229 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.186313 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.186421 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:54 crc kubenswrapper[4778]: E0318 09:04:54.186419 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:54 crc kubenswrapper[4778]: E0318 09:04:54.186735 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:54 crc kubenswrapper[4778]: E0318 09:04:54.186908 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:54 crc kubenswrapper[4778]: E0318 09:04:54.187036 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.208361 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:38Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0318 09:04:38.953123 6929 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 09:04:38.953789 6929 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:38.953820 6929 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:38.953837 6929 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI0318 09:04:38.955533 6929 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:38.955636 6929 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:38.955712 6929 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:38.955811 6929 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:38.955825 6929 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:38.955855 6929 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:38.955888 6929 factory.go:656] Stopping watch factory\\\\nI0318 09:04:38.955909 6929 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:38.955921 6929 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:38.956003 6929 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:38.956027 6929 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 09:04:38.956134 6929 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.224588 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.239275 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.252566 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.262760 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.284388 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: E0318 09:04:54.296597 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.300972 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.316257 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453
473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.330314 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 
09:04:54.355142 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.367457 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.381330 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.393535 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.405309 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.418107 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.433010 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.446049 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.458681 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.472592 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:55 crc 
kubenswrapper[4778]: I0318 09:04:55.073295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.073714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.073948 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.074437 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.074855 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:55Z","lastTransitionTime":"2026-03-18T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:55 crc kubenswrapper[4778]: E0318 09:04:55.094775 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:55Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.099934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.099984 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.099998 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.100022 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.100039 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:55Z","lastTransitionTime":"2026-03-18T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:55 crc kubenswrapper[4778]: E0318 09:04:55.112909 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:55Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.117937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.117972 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.117983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.117997 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.118031 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:55Z","lastTransitionTime":"2026-03-18T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:55 crc kubenswrapper[4778]: E0318 09:04:55.130643 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:55Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.135829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.136074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.136281 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.136443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.137027 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:55Z","lastTransitionTime":"2026-03-18T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:55 crc kubenswrapper[4778]: E0318 09:04:55.156937 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:55Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.162658 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.162921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.163064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.163240 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.163400 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:55Z","lastTransitionTime":"2026-03-18T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:55 crc kubenswrapper[4778]: E0318 09:04:55.180259 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:55Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:55 crc kubenswrapper[4778]: E0318 09:04:55.180905 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:04:56 crc kubenswrapper[4778]: I0318 09:04:56.186756 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:56 crc kubenswrapper[4778]: I0318 09:04:56.186810 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:56 crc kubenswrapper[4778]: I0318 09:04:56.187180 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:56 crc kubenswrapper[4778]: I0318 09:04:56.187228 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:56 crc kubenswrapper[4778]: E0318 09:04:56.187403 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:56 crc kubenswrapper[4778]: I0318 09:04:56.187515 4778 scope.go:117] "RemoveContainer" containerID="7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b" Mar 18 09:04:56 crc kubenswrapper[4778]: E0318 09:04:56.187618 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:56 crc kubenswrapper[4778]: E0318 09:04:56.187764 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:56 crc kubenswrapper[4778]: E0318 09:04:56.187504 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.025938 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/1.log" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.030230 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80"} Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.031043 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.056500 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.075779 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.091552 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.120005 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.136740 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.154784 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.171992 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.189129 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.204358 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.221651 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.238179 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.252981 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.269566 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.282178 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc 
kubenswrapper[4778]: I0318 09:04:57.295900 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.309188 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.321739 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.342387 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:38Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0318 09:04:38.953123 6929 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 09:04:38.953789 6929 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:38.953820 6929 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:38.953837 6929 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI0318 09:04:38.955533 6929 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:38.955636 6929 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:38.955712 6929 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:38.955811 6929 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:38.955825 6929 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:38.955855 6929 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:38.955888 6929 factory.go:656] Stopping watch factory\\\\nI0318 09:04:38.955909 6929 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:38.955921 6929 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:38.956003 6929 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:38.956027 6929 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 09:04:38.956134 6929 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.355714 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.037830 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/2.log" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.039045 4778 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/1.log" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.043530 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80" exitCode=1 Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.043576 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80"} Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.043616 4778 scope.go:117] "RemoveContainer" containerID="7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.044861 4778 scope.go:117] "RemoveContainer" containerID="355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80" Mar 18 09:04:58 crc kubenswrapper[4778]: E0318 09:04:58.045160 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.066828 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.088028 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.110267 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.145129 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:38Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0318 09:04:38.953123 6929 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 09:04:38.953789 6929 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:38.953820 6929 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:38.953837 6929 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI0318 09:04:38.955533 6929 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:38.955636 6929 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:38.955712 6929 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:38.955811 6929 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:38.955825 6929 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:38.955855 6929 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:38.955888 6929 factory.go:656] Stopping watch factory\\\\nI0318 09:04:38.955909 6929 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:38.955921 6929 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:38.956003 6929 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:38.956027 6929 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 09:04:38.956134 6929 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 
handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8
g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.171821 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.186730 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.186847 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:58 crc kubenswrapper[4778]: E0318 09:04:58.186921 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.186847 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:58 crc kubenswrapper[4778]: E0318 09:04:58.187076 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.187160 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:58 crc kubenswrapper[4778]: E0318 09:04:58.187302 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:58 crc kubenswrapper[4778]: E0318 09:04:58.187374 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.189556 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs
.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.209164 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.233007 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.268400 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.284810 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.299731 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.320036 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.337170 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.356354 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.374323 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.392410 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.408413 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.425796 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.438680 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc 
kubenswrapper[4778]: I0318 09:04:59.050828 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/2.log" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.057595 4778 scope.go:117] "RemoveContainer" containerID="355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80" Mar 18 09:04:59 crc kubenswrapper[4778]: E0318 09:04:59.058005 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.075448 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.090416 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc 
kubenswrapper[4778]: I0318 09:04:59.106616 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.124997 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.142996 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.164147 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.179706 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.195807 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.206915 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.233917 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.253345 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.266608 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.286498 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: E0318 09:04:59.298136 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.301362 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.314558 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.326059 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.337734 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.350672 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.363457 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:00 crc kubenswrapper[4778]: I0318 09:05:00.187245 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:00 crc kubenswrapper[4778]: I0318 09:05:00.187289 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:00 crc kubenswrapper[4778]: I0318 09:05:00.187285 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:00 crc kubenswrapper[4778]: E0318 09:05:00.187478 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:00 crc kubenswrapper[4778]: E0318 09:05:00.187593 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:00 crc kubenswrapper[4778]: E0318 09:05:00.187697 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:00 crc kubenswrapper[4778]: I0318 09:05:00.187857 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:00 crc kubenswrapper[4778]: E0318 09:05:00.187974 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:02 crc kubenswrapper[4778]: I0318 09:05:02.186632 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:02 crc kubenswrapper[4778]: I0318 09:05:02.186701 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:02 crc kubenswrapper[4778]: E0318 09:05:02.186861 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:02 crc kubenswrapper[4778]: I0318 09:05:02.186887 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:02 crc kubenswrapper[4778]: I0318 09:05:02.186886 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:02 crc kubenswrapper[4778]: E0318 09:05:02.187074 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:02 crc kubenswrapper[4778]: E0318 09:05:02.187166 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:02 crc kubenswrapper[4778]: E0318 09:05:02.187422 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.187121 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:04 crc kubenswrapper[4778]: E0318 09:05:04.187339 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.187441 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.187441 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:04 crc kubenswrapper[4778]: E0318 09:05:04.187617 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:04 crc kubenswrapper[4778]: E0318 09:05:04.187839 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.188689 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:04 crc kubenswrapper[4778]: E0318 09:05:04.189007 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.208542 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.231994 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.250876 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.267799 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.288642 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: E0318 09:05:04.298910 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.308310 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.327082 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.339053 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc 
kubenswrapper[4778]: I0318 09:05:04.353283 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc 
kubenswrapper[4778]: I0318 09:05:04.381517 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.398372 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.412478 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.427271 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.443172 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.459928 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.471457 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.484834 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.501045 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.533020 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.283010 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.283068 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.283080 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.283101 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.283113 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:05Z","lastTransitionTime":"2026-03-18T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:05 crc kubenswrapper[4778]: E0318 09:05:05.306015 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:05Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.312688 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.312750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.312770 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.312796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.312814 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:05Z","lastTransitionTime":"2026-03-18T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:05 crc kubenswrapper[4778]: E0318 09:05:05.327646 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:05Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.332330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.332376 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.332388 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.332399 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.332407 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:05Z","lastTransitionTime":"2026-03-18T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:05 crc kubenswrapper[4778]: E0318 09:05:05.345632 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:05Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.349243 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.349298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.349317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.349341 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.349361 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:05Z","lastTransitionTime":"2026-03-18T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.368925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.368973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.368984 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.369003 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.369016 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:05Z","lastTransitionTime":"2026-03-18T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:05 crc kubenswrapper[4778]: E0318 09:05:05.390048 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:05Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:05 crc kubenswrapper[4778]: E0318 09:05:05.390439 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:05:06 crc kubenswrapper[4778]: I0318 09:05:06.186513 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:06 crc kubenswrapper[4778]: E0318 09:05:06.186663 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:06 crc kubenswrapper[4778]: I0318 09:05:06.186764 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:06 crc kubenswrapper[4778]: E0318 09:05:06.186898 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:06 crc kubenswrapper[4778]: I0318 09:05:06.187108 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:06 crc kubenswrapper[4778]: I0318 09:05:06.187136 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:06 crc kubenswrapper[4778]: E0318 09:05:06.187234 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:06 crc kubenswrapper[4778]: E0318 09:05:06.187388 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:08 crc kubenswrapper[4778]: I0318 09:05:08.186610 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:08 crc kubenswrapper[4778]: I0318 09:05:08.186735 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:08 crc kubenswrapper[4778]: E0318 09:05:08.186782 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:08 crc kubenswrapper[4778]: I0318 09:05:08.186836 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:08 crc kubenswrapper[4778]: E0318 09:05:08.186943 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:08 crc kubenswrapper[4778]: I0318 09:05:08.186856 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:08 crc kubenswrapper[4778]: E0318 09:05:08.187080 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:08 crc kubenswrapper[4778]: E0318 09:05:08.187288 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:09 crc kubenswrapper[4778]: E0318 09:05:09.300067 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:10 crc kubenswrapper[4778]: I0318 09:05:10.186736 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:10 crc kubenswrapper[4778]: I0318 09:05:10.186762 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:10 crc kubenswrapper[4778]: I0318 09:05:10.186835 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:10 crc kubenswrapper[4778]: I0318 09:05:10.186890 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:10 crc kubenswrapper[4778]: E0318 09:05:10.187149 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:10 crc kubenswrapper[4778]: E0318 09:05:10.187342 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:10 crc kubenswrapper[4778]: E0318 09:05:10.187534 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:10 crc kubenswrapper[4778]: E0318 09:05:10.187741 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:12 crc kubenswrapper[4778]: I0318 09:05:12.186879 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:12 crc kubenswrapper[4778]: I0318 09:05:12.186949 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:12 crc kubenswrapper[4778]: I0318 09:05:12.187048 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:12 crc kubenswrapper[4778]: E0318 09:05:12.187103 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:12 crc kubenswrapper[4778]: E0318 09:05:12.187309 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:12 crc kubenswrapper[4778]: I0318 09:05:12.187328 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:12 crc kubenswrapper[4778]: E0318 09:05:12.187438 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:12 crc kubenswrapper[4778]: E0318 09:05:12.187753 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.114322 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/0.log" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.114424 4778 generic.go:334] "Generic (PLEG): container finished" podID="dce973f3-25e6-4536-87cc-9b46499ad7cf" containerID="27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5" exitCode=1 Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.114480 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerDied","Data":"27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5"} Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.115244 4778 scope.go:117] "RemoveContainer" containerID="27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5" Mar 18 
09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.145862 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.159818 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc 
kubenswrapper[4778]: I0318 09:05:14.175346 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.186369 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.186432 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:14 crc kubenswrapper[4778]: E0318 09:05:14.186526 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.186554 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:14 crc kubenswrapper[4778]: E0318 09:05:14.186758 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:14 crc kubenswrapper[4778]: E0318 09:05:14.186823 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.187817 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.188011 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:14 crc kubenswrapper[4778]: E0318 09:05:14.188111 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.188469 4778 scope.go:117] "RemoveContainer" containerID="355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80" Mar 18 09:05:14 crc kubenswrapper[4778]: E0318 09:05:14.188778 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.202857 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.221210 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.237568 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76327
8175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.250717 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.265475 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.280690 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: E0318 09:05:14.300664 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.303268 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.317985 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.329801 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.347655 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.361306 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.377289 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.391129 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.436735 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.478382 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.491376 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.514420 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.528255 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.539725 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.559628 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.570320 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.582411 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.595625 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.610795 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.625052 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.646554 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.673744 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.687762 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.704381 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.717967 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc 
kubenswrapper[4778]: I0318 09:05:14.731734 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.749132 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.766925 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.790962 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.122898 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/0.log" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.123398 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerStarted","Data":"3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562"} Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.151380 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.173418 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.197892 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.231043 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.248820 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.269708 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.287429 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.322994 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.344799 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.364470 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.390571 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.412630 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.438076 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.461007 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.481789 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.505628 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.529742 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.551324 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.571492 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc 
kubenswrapper[4778]: I0318 09:05:15.730718 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.730968 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.731094 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.731128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.731147 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:15Z","lastTransitionTime":"2026-03-18T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:15 crc kubenswrapper[4778]: E0318 09:05:15.755644 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.760638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.760721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.760741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.760781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.760852 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:15Z","lastTransitionTime":"2026-03-18T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:15 crc kubenswrapper[4778]: E0318 09:05:15.784117 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.790443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.790496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.790509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.790527 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.790540 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:15Z","lastTransitionTime":"2026-03-18T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:15 crc kubenswrapper[4778]: E0318 09:05:15.810765 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.816486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.816541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.816553 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.816573 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.816588 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:15Z","lastTransitionTime":"2026-03-18T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:15 crc kubenswrapper[4778]: E0318 09:05:15.835671 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.840329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.840389 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.840409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.840432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.840450 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:15Z","lastTransitionTime":"2026-03-18T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:15 crc kubenswrapper[4778]: E0318 09:05:15.855702 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: E0318 09:05:15.855948 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:05:16 crc kubenswrapper[4778]: I0318 09:05:16.187041 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:16 crc kubenswrapper[4778]: I0318 09:05:16.187131 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:16 crc kubenswrapper[4778]: I0318 09:05:16.187247 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:16 crc kubenswrapper[4778]: E0318 09:05:16.187297 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:16 crc kubenswrapper[4778]: I0318 09:05:16.187329 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:16 crc kubenswrapper[4778]: E0318 09:05:16.187645 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:16 crc kubenswrapper[4778]: E0318 09:05:16.187566 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:16 crc kubenswrapper[4778]: E0318 09:05:16.188347 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:16 crc kubenswrapper[4778]: I0318 09:05:16.654472 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:16 crc kubenswrapper[4778]: E0318 09:05:16.654689 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:05:16 crc kubenswrapper[4778]: E0318 09:05:16.654776 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:20.654750949 +0000 UTC m=+247.229495829 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:05:18 crc kubenswrapper[4778]: I0318 09:05:18.187479 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:18 crc kubenswrapper[4778]: I0318 09:05:18.187680 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:18 crc kubenswrapper[4778]: I0318 09:05:18.187746 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:18 crc kubenswrapper[4778]: E0318 09:05:18.187905 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:18 crc kubenswrapper[4778]: I0318 09:05:18.188257 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:18 crc kubenswrapper[4778]: E0318 09:05:18.188593 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:18 crc kubenswrapper[4778]: E0318 09:05:18.188720 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:18 crc kubenswrapper[4778]: E0318 09:05:18.188830 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:19 crc kubenswrapper[4778]: E0318 09:05:19.301803 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:20 crc kubenswrapper[4778]: I0318 09:05:20.187515 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:20 crc kubenswrapper[4778]: I0318 09:05:20.187683 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:20 crc kubenswrapper[4778]: I0318 09:05:20.187822 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:20 crc kubenswrapper[4778]: E0318 09:05:20.187829 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:20 crc kubenswrapper[4778]: I0318 09:05:20.187863 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:20 crc kubenswrapper[4778]: E0318 09:05:20.187989 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:20 crc kubenswrapper[4778]: E0318 09:05:20.188107 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:20 crc kubenswrapper[4778]: E0318 09:05:20.188228 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:22 crc kubenswrapper[4778]: I0318 09:05:22.187165 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:22 crc kubenswrapper[4778]: I0318 09:05:22.187248 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:22 crc kubenswrapper[4778]: E0318 09:05:22.187503 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:22 crc kubenswrapper[4778]: I0318 09:05:22.187569 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:22 crc kubenswrapper[4778]: I0318 09:05:22.187595 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:22 crc kubenswrapper[4778]: E0318 09:05:22.187794 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:22 crc kubenswrapper[4778]: E0318 09:05:22.187876 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:22 crc kubenswrapper[4778]: E0318 09:05:22.188179 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.186359 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.186515 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.186584 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:24 crc kubenswrapper[4778]: E0318 09:05:24.186749 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.186777 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:24 crc kubenswrapper[4778]: E0318 09:05:24.186951 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:24 crc kubenswrapper[4778]: E0318 09:05:24.187104 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:24 crc kubenswrapper[4778]: E0318 09:05:24.187270 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.207843 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.224681 4778 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.252298 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.267502 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.288530 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: E0318 09:05:24.302595 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.311279 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04
f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd
367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.328163 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.348045 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.368084 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.386700 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.408650 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.422649 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.435977 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.450483 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.463061 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc 
kubenswrapper[4778]: I0318 09:05:24.481050 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.494876 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.514515 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.534748 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.194992 4778 scope.go:117] "RemoveContainer" containerID="355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.916428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.916875 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.916885 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.916905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.916918 4778 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:25Z","lastTransitionTime":"2026-03-18T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:05:25 crc kubenswrapper[4778]: E0318 09:05:25.930111 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:25Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.934092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.934167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.934181 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.934240 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.934269 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:25Z","lastTransitionTime":"2026-03-18T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:25 crc kubenswrapper[4778]: E0318 09:05:25.947654 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:25Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.951398 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.951453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.951472 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.951490 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.951502 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:25Z","lastTransitionTime":"2026-03-18T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:25 crc kubenswrapper[4778]: E0318 09:05:25.965184 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:25Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.970070 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.970120 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.970136 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.970158 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.970172 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:25Z","lastTransitionTime":"2026-03-18T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:25 crc kubenswrapper[4778]: E0318 09:05:25.983044 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:25Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.986552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.986584 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.986594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.986610 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.986622 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:25Z","lastTransitionTime":"2026-03-18T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:25 crc kubenswrapper[4778]: E0318 09:05:25.997923 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:25Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:25 crc kubenswrapper[4778]: E0318 09:05:25.998060 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.168458 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/2.log" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.172300 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"} Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.172899 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.187168 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.187294 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.187204 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.187293 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:26 crc kubenswrapper[4778]: E0318 09:05:26.187393 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:26 crc kubenswrapper[4778]: E0318 09:05:26.187514 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:26 crc kubenswrapper[4778]: E0318 09:05:26.187570 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:26 crc kubenswrapper[4778]: E0318 09:05:26.187674 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.193647 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.210852 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.226686 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.245779 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.257762 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.276189 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.290024 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.312915 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.333901 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.349785 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.374800 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.392844 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.415507 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.430910 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.446107 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.460646 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.477072 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.488681 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.503409 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc 
kubenswrapper[4778]: I0318 09:05:27.180478 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/3.log" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.181718 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/2.log" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.185289 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" exitCode=1 Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.185340 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"} Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.185378 4778 scope.go:117] "RemoveContainer" containerID="355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.186331 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:05:27 crc kubenswrapper[4778]: E0318 09:05:27.186573 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.205401 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.225499 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.246971 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.262008 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.281549 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.297954 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.314572 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.331043 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc 
kubenswrapper[4778]: I0318 09:05:27.350348 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.372136 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.393615 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.427134 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:26Z\\\",\\\"message\\\":\\\"Distribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0318 09:05:26.271548 7451 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-56rc7\\\\nF0318 
09:05:26.271567 7451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z]\\\\nI0318 09:05:26.271448 7451 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-9f2bp\\\\nI0318 09:05:26.271585 7451 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-9f2bp in node crc\\\\nI0318 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.442292 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.455906 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.467410 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.491721 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.509599 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.523993 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.542409 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.186656 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.186711 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.186751 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.186668 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:28 crc kubenswrapper[4778]: E0318 09:05:28.186893 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:28 crc kubenswrapper[4778]: E0318 09:05:28.187024 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:28 crc kubenswrapper[4778]: E0318 09:05:28.187177 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:28 crc kubenswrapper[4778]: E0318 09:05:28.187514 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.191780 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/3.log" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.196982 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:05:28 crc kubenswrapper[4778]: E0318 09:05:28.197366 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.218794 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.234137 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.253632 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.273791 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.292133 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.310009 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.325642 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc 
kubenswrapper[4778]: I0318 09:05:28.343653 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.363432 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.380321 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.408861 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:26Z\\\",\\\"message\\\":\\\"Distribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBala
ncerIngress{},},Conditions:[]Condition{},},}\\\\nI0318 09:05:26.271548 7451 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-56rc7\\\\nF0318 09:05:26.271567 7451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z]\\\\nI0318 09:05:26.271448 7451 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-9f2bp\\\\nI0318 09:05:26.271585 7451 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-9f2bp in node crc\\\\nI0318 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:05:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.427474 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.451779 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.467879 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.483393 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.512826 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.525803 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.539043 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.553329 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:29 crc kubenswrapper[4778]: E0318 09:05:29.304795 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:30 crc kubenswrapper[4778]: I0318 09:05:30.186276 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:30 crc kubenswrapper[4778]: E0318 09:05:30.186472 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:30 crc kubenswrapper[4778]: I0318 09:05:30.186792 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:30 crc kubenswrapper[4778]: E0318 09:05:30.186884 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:30 crc kubenswrapper[4778]: I0318 09:05:30.187115 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:30 crc kubenswrapper[4778]: E0318 09:05:30.187268 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:30 crc kubenswrapper[4778]: I0318 09:05:30.187527 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:30 crc kubenswrapper[4778]: E0318 09:05:30.187647 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:32 crc kubenswrapper[4778]: I0318 09:05:32.187283 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:32 crc kubenswrapper[4778]: I0318 09:05:32.187315 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:32 crc kubenswrapper[4778]: E0318 09:05:32.187795 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:32 crc kubenswrapper[4778]: I0318 09:05:32.187372 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:32 crc kubenswrapper[4778]: I0318 09:05:32.187356 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:32 crc kubenswrapper[4778]: E0318 09:05:32.187986 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:32 crc kubenswrapper[4778]: E0318 09:05:32.188136 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:32 crc kubenswrapper[4778]: E0318 09:05:32.188338 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.186753 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.186825 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:34 crc kubenswrapper[4778]: E0318 09:05:34.187002 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.187037 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.187082 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:34 crc kubenswrapper[4778]: E0318 09:05:34.187236 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:34 crc kubenswrapper[4778]: E0318 09:05:34.187383 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:34 crc kubenswrapper[4778]: E0318 09:05:34.187594 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.208109 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13
fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\
\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cx
hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e
94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.227288 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.244832 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.258961 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.284166 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: E0318 09:05:34.305460 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.309698 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.325417 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.343900 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.358829 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.377653 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.394718 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.410905 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.433300 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.447951 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.461234 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc 
kubenswrapper[4778]: I0318 09:05:34.479538 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.499327 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.513142 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.542238 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:26Z\\\",\\\"message\\\":\\\"Distribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBala
ncerIngress{},},Conditions:[]Condition{},},}\\\\nI0318 09:05:26.271548 7451 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-56rc7\\\\nF0318 09:05:26.271567 7451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z]\\\\nI0318 09:05:26.271448 7451 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-9f2bp\\\\nI0318 09:05:26.271585 7451 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-9f2bp in node crc\\\\nI0318 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:05:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.186789 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.186862 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.186991 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.187164 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.187437 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.187534 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.187729 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.187823 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.348542 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.348611 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.348635 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.348665 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.348689 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:36Z","lastTransitionTime":"2026-03-18T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.371705 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.375892 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.375932 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.375942 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.375959 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.375972 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:36Z","lastTransitionTime":"2026-03-18T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.392019 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.396722 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.396773 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.396791 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.396814 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.396829 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:36Z","lastTransitionTime":"2026-03-18T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.413896 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.418921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.419068 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.419145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.419266 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.419363 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:36Z","lastTransitionTime":"2026-03-18T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.434701 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.439518 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.439587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.439599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.439616 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.439630 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:36Z","lastTransitionTime":"2026-03-18T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.456721 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.456891 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.108593 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.108822 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:07:40.108782209 +0000 UTC m=+326.683527059 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.186480 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.186550 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.186618 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.186729 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.186780 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.186942 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.187098 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.187322 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.210555 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.210616 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.210652 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.210675 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210745 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210833 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210869 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210881 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210889 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210924 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" 
failed. No retries permitted until 2026-03-18 09:07:40.210886645 +0000 UTC m=+326.785631515 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210964 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:07:40.210941386 +0000 UTC m=+326.785686236 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210989 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.211050 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.211001 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-18 09:07:40.210987367 +0000 UTC m=+326.785732217 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.211076 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.211251 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:07:40.211161452 +0000 UTC m=+326.785906412 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:05:39 crc kubenswrapper[4778]: E0318 09:05:39.307142 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 18 09:05:40 crc kubenswrapper[4778]: I0318 09:05:40.186585 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:40 crc kubenswrapper[4778]: I0318 09:05:40.186662 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:40 crc kubenswrapper[4778]: I0318 09:05:40.186612 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:40 crc kubenswrapper[4778]: I0318 09:05:40.186840 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:40 crc kubenswrapper[4778]: E0318 09:05:40.186826 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:40 crc kubenswrapper[4778]: E0318 09:05:40.187044 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:40 crc kubenswrapper[4778]: E0318 09:05:40.187182 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:40 crc kubenswrapper[4778]: E0318 09:05:40.187278 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:42 crc kubenswrapper[4778]: I0318 09:05:42.186903 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:42 crc kubenswrapper[4778]: I0318 09:05:42.186924 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:42 crc kubenswrapper[4778]: I0318 09:05:42.187924 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:42 crc kubenswrapper[4778]: E0318 09:05:42.188004 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:42 crc kubenswrapper[4778]: I0318 09:05:42.188558 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:42 crc kubenswrapper[4778]: E0318 09:05:42.189264 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:42 crc kubenswrapper[4778]: E0318 09:05:42.189106 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:42 crc kubenswrapper[4778]: I0318 09:05:42.189111 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:05:42 crc kubenswrapper[4778]: E0318 09:05:42.189617 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:42 crc kubenswrapper[4778]: E0318 09:05:42.189659 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.186859 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.186921 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.186921 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:44 crc kubenswrapper[4778]: E0318 09:05:44.187133 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.187164 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:44 crc kubenswrapper[4778]: E0318 09:05:44.187341 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:44 crc kubenswrapper[4778]: E0318 09:05:44.187541 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:44 crc kubenswrapper[4778]: E0318 09:05:44.187685 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.204934 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.219873 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.239968 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.257629 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.278740 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.300460 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:26Z\\\",\\\"message\\\":\\\"Distribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBala
ncerIngress{},},Conditions:[]Condition{},},}\\\\nI0318 09:05:26.271548 7451 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-56rc7\\\\nF0318 09:05:26.271567 7451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z]\\\\nI0318 09:05:26.271448 7451 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-9f2bp\\\\nI0318 09:05:26.271585 7451 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-9f2bp in node crc\\\\nI0318 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:05:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: E0318 09:05:44.307752 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.325217 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22
393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.343389 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e9
6f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.358841 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.372375 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.405156 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.420359 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.433146 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.449524 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.461938 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.480671 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.497388 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.512316 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.528581 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.186572 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.186592 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.186689 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.186775 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.186993 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.187182 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.187421 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.187622 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.583234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.583298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.583322 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.583353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.583376 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:46Z","lastTransitionTime":"2026-03-18T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.603989 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:46Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.609760 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.609805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.609814 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.609828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.609839 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:46Z","lastTransitionTime":"2026-03-18T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.627609 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:46Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.632825 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.632895 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.632909 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.632934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.632951 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:46Z","lastTransitionTime":"2026-03-18T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.648474 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:46Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.652828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.652861 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.652872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.652890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.652900 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:46Z","lastTransitionTime":"2026-03-18T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.669244 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:46Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.673914 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.673963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.673974 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.673991 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.674006 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:46Z","lastTransitionTime":"2026-03-18T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.686656 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:46Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.686821 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:05:48 crc kubenswrapper[4778]: I0318 09:05:48.187048 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:48 crc kubenswrapper[4778]: I0318 09:05:48.187048 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:48 crc kubenswrapper[4778]: E0318 09:05:48.187829 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:48 crc kubenswrapper[4778]: I0318 09:05:48.187493 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:48 crc kubenswrapper[4778]: I0318 09:05:48.187287 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:48 crc kubenswrapper[4778]: E0318 09:05:48.187927 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:48 crc kubenswrapper[4778]: E0318 09:05:48.188067 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:48 crc kubenswrapper[4778]: E0318 09:05:48.188157 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:49 crc kubenswrapper[4778]: E0318 09:05:49.309109 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:50 crc kubenswrapper[4778]: I0318 09:05:50.186541 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:50 crc kubenswrapper[4778]: I0318 09:05:50.186608 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:50 crc kubenswrapper[4778]: I0318 09:05:50.186515 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:50 crc kubenswrapper[4778]: I0318 09:05:50.186545 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:50 crc kubenswrapper[4778]: E0318 09:05:50.186743 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:50 crc kubenswrapper[4778]: E0318 09:05:50.186825 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:50 crc kubenswrapper[4778]: E0318 09:05:50.186903 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:50 crc kubenswrapper[4778]: E0318 09:05:50.187176 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:52 crc kubenswrapper[4778]: I0318 09:05:52.186421 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:52 crc kubenswrapper[4778]: I0318 09:05:52.186508 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:52 crc kubenswrapper[4778]: I0318 09:05:52.186570 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:52 crc kubenswrapper[4778]: I0318 09:05:52.186421 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:52 crc kubenswrapper[4778]: E0318 09:05:52.186630 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:52 crc kubenswrapper[4778]: E0318 09:05:52.186791 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:52 crc kubenswrapper[4778]: E0318 09:05:52.186894 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:52 crc kubenswrapper[4778]: E0318 09:05:52.186997 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.186530 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.186676 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:54 crc kubenswrapper[4778]: E0318 09:05:54.186845 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.186890 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.186902 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:54 crc kubenswrapper[4778]: E0318 09:05:54.187115 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:54 crc kubenswrapper[4778]: E0318 09:05:54.187371 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:54 crc kubenswrapper[4778]: E0318 09:05:54.187543 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.256283 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=137.256250315 podStartE2EDuration="2m17.256250315s" podCreationTimestamp="2026-03-18 09:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.238495157 +0000 UTC m=+220.813240057" watchObservedRunningTime="2026-03-18 09:05:54.256250315 +0000 UTC m=+220.830995155" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.256660 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.256653647 podStartE2EDuration="1m11.256653647s" podCreationTimestamp="2026-03-18 09:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.255789263 +0000 UTC m=+220.830534123" watchObservedRunningTime="2026-03-18 09:05:54.256653647 +0000 UTC m=+220.831398487" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.278025 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=103.277991102 
podStartE2EDuration="1m43.277991102s" podCreationTimestamp="2026-03-18 09:04:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.276566134 +0000 UTC m=+220.851311034" watchObservedRunningTime="2026-03-18 09:05:54.277991102 +0000 UTC m=+220.852735982" Mar 18 09:05:54 crc kubenswrapper[4778]: E0318 09:05:54.310233 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.340329 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podStartSLOduration=150.340303774 podStartE2EDuration="2m30.340303774s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.33978112 +0000 UTC m=+220.914525970" watchObservedRunningTime="2026-03-18 09:05:54.340303774 +0000 UTC m=+220.915048614" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.448641 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-r2lvf" podStartSLOduration=150.448613287 podStartE2EDuration="2m30.448613287s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.44833349 +0000 UTC m=+221.023078350" watchObservedRunningTime="2026-03-18 09:05:54.448613287 +0000 UTC m=+221.023358117" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.487901 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=65.487874287 podStartE2EDuration="1m5.487874287s" podCreationTimestamp="2026-03-18 09:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.487591609 +0000 UTC m=+221.062336469" watchObservedRunningTime="2026-03-18 09:05:54.487874287 +0000 UTC m=+221.062619127" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.510850 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=139.510823606 podStartE2EDuration="2m19.510823606s" podCreationTimestamp="2026-03-18 09:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.510575519 +0000 UTC m=+221.085320369" watchObservedRunningTime="2026-03-18 09:05:54.510823606 +0000 UTC m=+221.085568446" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.558340 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dfnnp" podStartSLOduration=150.558311607 podStartE2EDuration="2m30.558311607s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.536443347 +0000 UTC m=+221.111188207" watchObservedRunningTime="2026-03-18 09:05:54.558311607 +0000 UTC m=+221.133056497" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.570032 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" podStartSLOduration=150.570014863 podStartE2EDuration="2m30.570014863s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-18 09:05:54.556943851 +0000 UTC m=+221.131688701" watchObservedRunningTime="2026-03-18 09:05:54.570014863 +0000 UTC m=+221.144759703" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.583763 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9f2bp" podStartSLOduration=150.583736403 podStartE2EDuration="2m30.583736403s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.569650763 +0000 UTC m=+221.144395613" watchObservedRunningTime="2026-03-18 09:05:54.583736403 +0000 UTC m=+221.158481233" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.584292 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" podStartSLOduration=150.584285969 podStartE2EDuration="2m30.584285969s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.583667472 +0000 UTC m=+221.158412322" watchObservedRunningTime="2026-03-18 09:05:54.584285969 +0000 UTC m=+221.159030809" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.186790 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.186781 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.186898 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.186954 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:56 crc kubenswrapper[4778]: E0318 09:05:56.187062 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:56 crc kubenswrapper[4778]: E0318 09:05:56.187233 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:56 crc kubenswrapper[4778]: E0318 09:05:56.187712 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:56 crc kubenswrapper[4778]: E0318 09:05:56.187945 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.188134 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:05:56 crc kubenswrapper[4778]: E0318 09:05:56.188370 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.996470 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.996539 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.996559 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.996585 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.996605 4778 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:56Z","lastTransitionTime":"2026-03-18T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.055997 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6"] Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.056751 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.059751 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.059989 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.060076 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.061105 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.191478 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/19c1bc08-d789-4555-b6b7-6c162b9d8158-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.191594 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c1bc08-d789-4555-b6b7-6c162b9d8158-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.191883 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19c1bc08-d789-4555-b6b7-6c162b9d8158-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.191957 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19c1bc08-d789-4555-b6b7-6c162b9d8158-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.192056 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/19c1bc08-d789-4555-b6b7-6c162b9d8158-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.230390 4778 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Rotating certificates Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.241419 4778 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.293725 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/19c1bc08-d789-4555-b6b7-6c162b9d8158-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.293824 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c1bc08-d789-4555-b6b7-6c162b9d8158-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.293855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19c1bc08-d789-4555-b6b7-6c162b9d8158-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.293876 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19c1bc08-d789-4555-b6b7-6c162b9d8158-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: 
I0318 09:05:57.293910 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/19c1bc08-d789-4555-b6b7-6c162b9d8158-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.293928 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/19c1bc08-d789-4555-b6b7-6c162b9d8158-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.293977 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/19c1bc08-d789-4555-b6b7-6c162b9d8158-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.295361 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c1bc08-d789-4555-b6b7-6c162b9d8158-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.303773 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19c1bc08-d789-4555-b6b7-6c162b9d8158-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: 
\"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.324799 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19c1bc08-d789-4555-b6b7-6c162b9d8158-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.371442 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: W0318 09:05:57.390165 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19c1bc08_d789_4555_b6b7_6c162b9d8158.slice/crio-b0efafae09bcbd40d186b756b841a4bd6bae22bea62d8dcd3bf1bde537833265 WatchSource:0}: Error finding container b0efafae09bcbd40d186b756b841a4bd6bae22bea62d8dcd3bf1bde537833265: Status 404 returned error can't find the container with id b0efafae09bcbd40d186b756b841a4bd6bae22bea62d8dcd3bf1bde537833265 Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.187190 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.187359 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:58 crc kubenswrapper[4778]: E0318 09:05:58.187403 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.187462 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.187190 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:58 crc kubenswrapper[4778]: E0318 09:05:58.187604 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:58 crc kubenswrapper[4778]: E0318 09:05:58.187713 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:58 crc kubenswrapper[4778]: E0318 09:05:58.187787 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.306760 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" event={"ID":"19c1bc08-d789-4555-b6b7-6c162b9d8158","Type":"ContainerStarted","Data":"038fea5fb3390926dc22b9d4f252283c51a9262d5aa1d283947fe584b0a13ffc"} Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.306821 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" event={"ID":"19c1bc08-d789-4555-b6b7-6c162b9d8158","Type":"ContainerStarted","Data":"b0efafae09bcbd40d186b756b841a4bd6bae22bea62d8dcd3bf1bde537833265"} Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.330545 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" podStartSLOduration=154.330514799 podStartE2EDuration="2m34.330514799s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:58.330482118 +0000 UTC m=+224.905226968" watchObservedRunningTime="2026-03-18 09:05:58.330514799 +0000 UTC m=+224.905259639" Mar 18 09:05:59 crc kubenswrapper[4778]: E0318 09:05:59.311188 4778 kubelet.go:2916] "Container runtime network not ready" 
networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.187081 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.187089 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.187153 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.187285 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:06:00 crc kubenswrapper[4778]: E0318 09:06:00.187505 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:06:00 crc kubenswrapper[4778]: E0318 09:06:00.187629 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:06:00 crc kubenswrapper[4778]: E0318 09:06:00.187814 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:06:00 crc kubenswrapper[4778]: E0318 09:06:00.187952 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.317094 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/1.log" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.318425 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/0.log" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.318507 4778 generic.go:334] "Generic (PLEG): container finished" podID="dce973f3-25e6-4536-87cc-9b46499ad7cf" containerID="3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562" exitCode=1 Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.318552 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" 
event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerDied","Data":"3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562"} Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.318612 4778 scope.go:117] "RemoveContainer" containerID="27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.319340 4778 scope.go:117] "RemoveContainer" containerID="3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562" Mar 18 09:06:00 crc kubenswrapper[4778]: E0318 09:06:00.319669 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-r2lvf_openshift-multus(dce973f3-25e6-4536-87cc-9b46499ad7cf)\"" pod="openshift-multus/multus-r2lvf" podUID="dce973f3-25e6-4536-87cc-9b46499ad7cf" Mar 18 09:06:01 crc kubenswrapper[4778]: I0318 09:06:01.323994 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/1.log" Mar 18 09:06:02 crc kubenswrapper[4778]: I0318 09:06:02.186694 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:06:02 crc kubenswrapper[4778]: I0318 09:06:02.186848 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:06:02 crc kubenswrapper[4778]: I0318 09:06:02.186949 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:02 crc kubenswrapper[4778]: E0318 09:06:02.186943 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:06:02 crc kubenswrapper[4778]: I0318 09:06:02.187008 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:06:02 crc kubenswrapper[4778]: E0318 09:06:02.187079 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:06:02 crc kubenswrapper[4778]: E0318 09:06:02.187297 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:06:02 crc kubenswrapper[4778]: E0318 09:06:02.187408 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:06:04 crc kubenswrapper[4778]: I0318 09:06:04.187286 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:04 crc kubenswrapper[4778]: I0318 09:06:04.187304 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:06:04 crc kubenswrapper[4778]: I0318 09:06:04.187439 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:06:04 crc kubenswrapper[4778]: E0318 09:06:04.188514 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:06:04 crc kubenswrapper[4778]: I0318 09:06:04.188535 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:06:04 crc kubenswrapper[4778]: E0318 09:06:04.188677 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:06:04 crc kubenswrapper[4778]: E0318 09:06:04.188678 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:06:04 crc kubenswrapper[4778]: E0318 09:06:04.188747 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:06:04 crc kubenswrapper[4778]: E0318 09:06:04.312107 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:06:06 crc kubenswrapper[4778]: I0318 09:06:06.186607 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:06 crc kubenswrapper[4778]: E0318 09:06:06.186769 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:06:06 crc kubenswrapper[4778]: I0318 09:06:06.186894 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:06:06 crc kubenswrapper[4778]: I0318 09:06:06.187041 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:06:06 crc kubenswrapper[4778]: E0318 09:06:06.187124 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:06:06 crc kubenswrapper[4778]: I0318 09:06:06.187047 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:06:06 crc kubenswrapper[4778]: E0318 09:06:06.187231 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:06:06 crc kubenswrapper[4778]: E0318 09:06:06.187399 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.187044 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.187215 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:06:08 crc kubenswrapper[4778]: E0318 09:06:08.187328 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.187592 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.187620 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.188225 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:06:08 crc kubenswrapper[4778]: E0318 09:06:08.188586 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:06:08 crc kubenswrapper[4778]: E0318 09:06:08.188641 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:06:08 crc kubenswrapper[4778]: E0318 09:06:08.188812 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.351889 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/3.log" Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.357077 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"} Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.357611 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.392962 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podStartSLOduration=164.392934417 podStartE2EDuration="2m44.392934417s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:08.391641641 +0000 UTC m=+234.966386531" watchObservedRunningTime="2026-03-18 09:06:08.392934417 +0000 UTC m=+234.967679297" Mar 18 09:06:09 crc kubenswrapper[4778]: I0318 09:06:09.304992 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9bc7s"] Mar 18 09:06:09 crc kubenswrapper[4778]: I0318 09:06:09.305096 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:09 crc kubenswrapper[4778]: E0318 09:06:09.305215 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:06:09 crc kubenswrapper[4778]: E0318 09:06:09.313779 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:06:10 crc kubenswrapper[4778]: I0318 09:06:10.186780 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:06:10 crc kubenswrapper[4778]: E0318 09:06:10.187301 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:06:10 crc kubenswrapper[4778]: I0318 09:06:10.186907 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:06:10 crc kubenswrapper[4778]: E0318 09:06:10.187400 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:06:10 crc kubenswrapper[4778]: I0318 09:06:10.186852 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:06:10 crc kubenswrapper[4778]: E0318 09:06:10.187459 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:06:11 crc kubenswrapper[4778]: I0318 09:06:11.187158 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:11 crc kubenswrapper[4778]: E0318 09:06:11.187388 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:06:12 crc kubenswrapper[4778]: I0318 09:06:12.187010 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:06:12 crc kubenswrapper[4778]: I0318 09:06:12.187123 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:06:12 crc kubenswrapper[4778]: E0318 09:06:12.187801 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:06:12 crc kubenswrapper[4778]: E0318 09:06:12.187589 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:06:12 crc kubenswrapper[4778]: I0318 09:06:12.187166 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:06:12 crc kubenswrapper[4778]: E0318 09:06:12.187919 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:06:13 crc kubenswrapper[4778]: I0318 09:06:13.187165 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:13 crc kubenswrapper[4778]: I0318 09:06:13.187822 4778 scope.go:117] "RemoveContainer" containerID="3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562" Mar 18 09:06:13 crc kubenswrapper[4778]: E0318 09:06:13.188475 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:06:13 crc kubenswrapper[4778]: I0318 09:06:13.380754 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/1.log" Mar 18 09:06:13 crc kubenswrapper[4778]: I0318 09:06:13.380863 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerStarted","Data":"f3ed13c08ae99a4b04465da39cc132336b6c856e9f5e19d24c954fe658b51ed0"} Mar 18 09:06:14 crc kubenswrapper[4778]: I0318 09:06:14.186879 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:06:14 crc kubenswrapper[4778]: I0318 09:06:14.189369 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:06:14 crc kubenswrapper[4778]: I0318 09:06:14.189550 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:06:14 crc kubenswrapper[4778]: E0318 09:06:14.189651 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:06:14 crc kubenswrapper[4778]: E0318 09:06:14.190554 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:06:14 crc kubenswrapper[4778]: E0318 09:06:14.193035 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:06:14 crc kubenswrapper[4778]: E0318 09:06:14.314661 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:06:15 crc kubenswrapper[4778]: I0318 09:06:15.186138 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:15 crc kubenswrapper[4778]: E0318 09:06:15.186409 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:06:16 crc kubenswrapper[4778]: I0318 09:06:16.186640 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:06:16 crc kubenswrapper[4778]: I0318 09:06:16.186884 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:06:16 crc kubenswrapper[4778]: I0318 09:06:16.186905 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:06:16 crc kubenswrapper[4778]: E0318 09:06:16.187239 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:06:16 crc kubenswrapper[4778]: E0318 09:06:16.187064 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:06:16 crc kubenswrapper[4778]: E0318 09:06:16.187416 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:06:17 crc kubenswrapper[4778]: I0318 09:06:17.186277 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:17 crc kubenswrapper[4778]: E0318 09:06:17.186511 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:06:18 crc kubenswrapper[4778]: I0318 09:06:18.186228 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:06:18 crc kubenswrapper[4778]: I0318 09:06:18.186270 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:06:18 crc kubenswrapper[4778]: E0318 09:06:18.186493 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:06:18 crc kubenswrapper[4778]: E0318 09:06:18.186682 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:06:18 crc kubenswrapper[4778]: I0318 09:06:18.186973 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:06:18 crc kubenswrapper[4778]: E0318 09:06:18.187117 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:06:19 crc kubenswrapper[4778]: I0318 09:06:19.186725 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:19 crc kubenswrapper[4778]: E0318 09:06:19.187007 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.186519 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.186584 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.186839 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.190146 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.190693 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.191422 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.191531 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.664952 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:20 crc kubenswrapper[4778]: E0318 09:06:20.665135 4778 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:06:20 crc kubenswrapper[4778]: E0318 09:06:20.665275 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:08:22.665184512 +0000 UTC m=+369.239929372 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:06:21 crc kubenswrapper[4778]: I0318 09:06:21.186402 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:21 crc kubenswrapper[4778]: I0318 09:06:21.189961 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 09:06:21 crc kubenswrapper[4778]: I0318 09:06:21.190648 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.253915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.299554 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.300283 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-pgsqh"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.300539 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.300731 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.301514 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.302139 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.303437 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.304468 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.305175 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bhmq7"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.305808 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.307178 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q7qs8"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.307726 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.315505 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.315800 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.316041 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.316348 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.316521 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.316380 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.320356 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.320526 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.320532 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.320594 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.321370 4778 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.321403 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.322646 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.322795 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.323088 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.325185 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.326715 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.327251 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ckp9s"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.327463 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.327685 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.328704 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.329872 4778 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.347246 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.347710 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.348684 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.366700 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.366985 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.369981 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.370294 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.370357 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.370582 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.370799 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 09:06:27 
crc kubenswrapper[4778]: I0318 09:06:27.371024 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.371167 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.371525 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.372741 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.372994 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.373047 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nkbq4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.373171 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.373297 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.373641 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.374296 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.374460 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.374815 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.375377 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.377575 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.377633 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.377756 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.377901 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.377597 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.382133 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq"] Mar 18 09:06:27 crc 
kubenswrapper[4778]: I0318 09:06:27.382866 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.405679 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.405905 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.406031 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.406213 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.406314 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.408629 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.408903 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.408990 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.409372 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.410281 4778 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.410528 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.410887 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.410992 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.411069 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.412022 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.414431 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mlr7l"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.414974 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvvlz"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.415297 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.415410 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.415680 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.415954 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.417168 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5dpv"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.418538 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.420884 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.422630 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.423083 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.423153 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.423229 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.423294 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.423372 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 
09:06:27.423712 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.424086 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.424276 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.438556 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-nnfvg"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.439548 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.439969 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.440403 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.440579 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.440853 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.444756 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.445300 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.447757 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.471661 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.472064 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.472938 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.473603 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w2lf2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.473914 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn"] Mar 18 
09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.474937 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.475006 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.475101 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.475520 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.476084 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.476394 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.476616 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.477688 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.477894 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478176 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-smtz9"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478379 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0167d9e-5565-4154-80bb-3856d9b5985f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478414 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-image-import-ca\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478437 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-config\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478454 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-console-config\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478468 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7qw9\" (UniqueName: \"kubernetes.io/projected/5f875d21-ddf2-4d41-8be3-819c8836824a-kube-api-access-z7qw9\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478485 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdf835c-58e4-4297-a247-690f407af22d-serving-cert\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478510 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-tls\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478524 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-etcd-serving-ca\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478537 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-trusted-ca-bundle\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478563 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e93d5ac-22fb-4d53-86c4-3262993f2116-node-pullsecrets\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478578 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-serving-cert\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478594 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478618 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zpgn\" (UniqueName: 
\"kubernetes.io/projected/a2907797-7fb3-44c0-81cf-783512fd1bf6-kube-api-access-4zpgn\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478636 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-encryption-config\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478671 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-audit\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478684 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5flzf\" (UniqueName: \"kubernetes.io/projected/6e93d5ac-22fb-4d53-86c4-3262993f2116-kube-api-access-5flzf\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478700 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5h99\" (UniqueName: \"kubernetes.io/projected/6cdf835c-58e4-4297-a247-690f407af22d-kube-api-access-n5h99\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478716 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-client-ca\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478731 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e93d5ac-22fb-4d53-86c4-3262993f2116-audit-dir\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478746 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35cf99cc-0bae-4b8d-b861-103e3174f081-audit-dir\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478762 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-audit-policies\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: 
\"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478779 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-config\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478801 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478819 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478823 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478836 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-serving-cert\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478851 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llblr\" (UniqueName: \"kubernetes.io/projected/918ba01d-c786-4f9a-ae58-5bcc23684c16-kube-api-access-llblr\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478867 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-serving-cert\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478883 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-serving-cert\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478901 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-service-ca\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478917 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478941 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc9zp\" (UniqueName: \"kubernetes.io/projected/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-kube-api-access-hc9zp\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478957 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kd5b\" (UniqueName: \"kubernetes.io/projected/034ec244-f99c-4c50-a55a-9b33b8b376c3-kube-api-access-9kd5b\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478979 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p7td\" (UniqueName: \"kubernetes.io/projected/35cf99cc-0bae-4b8d-b861-103e3174f081-kube-api-access-5p7td\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: 
\"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478995 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-certificates\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479010 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2907797-7fb3-44c0-81cf-783512fd1bf6-serving-cert\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479027 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918ba01d-c786-4f9a-ae58-5bcc23684c16-config\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479040 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0167d9e-5565-4154-80bb-3856d9b5985f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479057 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479072 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-config\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479087 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-oauth-config\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479114 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0167d9e-5565-4154-80bb-3856d9b5985f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479129 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-trusted-ca\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: 
\"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479144 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034ec244-f99c-4c50-a55a-9b33b8b376c3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479158 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-etcd-client\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479176 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-service-ca-bundle\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479211 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-etcd-client\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479228 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6cdf835c-58e4-4297-a247-690f407af22d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479243 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-encryption-config\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479261 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-bound-sa-token\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479277 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsq4q\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-kube-api-access-fsq4q\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479293 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479308 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918ba01d-c786-4f9a-ae58-5bcc23684c16-serving-cert\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479325 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/918ba01d-c786-4f9a-ae58-5bcc23684c16-trusted-ca\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479343 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034ec244-f99c-4c50-a55a-9b33b8b376c3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479361 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-oauth-serving-cert\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478747 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4"] Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 
09:06:27.479729 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:27.979717959 +0000 UTC m=+254.554462799 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479967 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.480304 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.480412 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.480690 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.480769 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.481100 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.481108 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.481430 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.481789 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.481918 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.482432 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.482569 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.482611 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.482692 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.482774 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.482911 4778 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.483111 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.483226 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.483296 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.483451 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484237 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484349 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484476 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484636 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484814 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484828 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484930 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.485327 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gkpf4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.485659 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.486000 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.486589 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.486931 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hr48"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.487260 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-tnw27"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.487523 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.487580 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-tnw27" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.487732 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.487763 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.488296 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.488420 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.490327 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.491538 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.491267 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.492274 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-57msj"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.492665 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.493039 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.493226 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.493679 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.496355 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563744-btdt7"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.497005 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6frtc"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.497932 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.498119 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.499310 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.499584 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.500389 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.503061 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.503328 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pgsqh"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.509514 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563746-b66f7"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.511988 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.521532 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.521907 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.523972 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.523996 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.524007 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bhmq7"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.524056 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.524071 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.526243 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.528105 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.528693 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wgbcp"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.531544 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5gdpq"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.531800 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.532990 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.533061 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.533704 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvvlz"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.535389 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.537081 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.538307 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.539533 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.540945 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.542318 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.542653 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.544037 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gkpf4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.545470 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-nkbq4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.547061 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.548671 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.550148 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ckp9s"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.551496 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qqzxx"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.552382 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.553190 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jmmm2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.554045 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.554899 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tnw27"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.556775 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5dpv"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.558037 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q7qs8"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.559271 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563744-btdt7"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.560226 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hr48"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.561247 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.562236 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jmmm2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.562669 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.563418 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mlr7l"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.564469 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-smtz9"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.565870 4778 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.567299 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.568641 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w2lf2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.569736 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.570783 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.571888 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-57msj"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.572930 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6frtc"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.574038 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5gdpq"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.575173 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.576041 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.578181 4778 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563746-b66f7"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.580266 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.580627 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc9zp\" (UniqueName: \"kubernetes.io/projected/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-kube-api-access-hc9zp\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.580678 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kd5b\" (UniqueName: \"kubernetes.io/projected/034ec244-f99c-4c50-a55a-9b33b8b376c3-kube-api-access-9kd5b\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.580710 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p7td\" (UniqueName: \"kubernetes.io/projected/35cf99cc-0bae-4b8d-b861-103e3174f081-kube-api-access-5p7td\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.580736 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.080706124 +0000 UTC m=+254.655450964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.580805 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-certificates\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.581022 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2907797-7fb3-44c0-81cf-783512fd1bf6-serving-cert\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.581233 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918ba01d-c786-4f9a-ae58-5bcc23684c16-config\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 
09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.581273 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0167d9e-5565-4154-80bb-3856d9b5985f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.582090 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-certificates\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.582505 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.582571 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgbcp"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.582570 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918ba01d-c786-4f9a-ae58-5bcc23684c16-config\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.582672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-config\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.583010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.583173 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-oauth-config\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.584039 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-config\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.584541 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.584770 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.584954 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0167d9e-5565-4154-80bb-3856d9b5985f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.585104 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-etcd-client\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.585233 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-trusted-ca\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.585352 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034ec244-f99c-4c50-a55a-9b33b8b376c3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.586979 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-service-ca-bundle\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 
09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.587502 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-etcd-client\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.587573 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6cdf835c-58e4-4297-a247-690f407af22d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.587611 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-encryption-config\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.588575 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6cdf835c-58e4-4297-a247-690f407af22d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.588662 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918ba01d-c786-4f9a-ae58-5bcc23684c16-serving-cert\") pod \"console-operator-58897d9998-q7qs8\" (UID: 
\"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.588728 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/918ba01d-c786-4f9a-ae58-5bcc23684c16-trusted-ca\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.588786 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-bound-sa-token\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.588831 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsq4q\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-kube-api-access-fsq4q\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.589175 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.589258 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034ec244-f99c-4c50-a55a-9b33b8b376c3-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.589312 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-oauth-serving-cert\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.589377 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0167d9e-5565-4154-80bb-3856d9b5985f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.589631 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.590948 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.591634 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-oauth-config\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 
09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592086 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-etcd-client\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.591965 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/918ba01d-c786-4f9a-ae58-5bcc23684c16-trusted-ca\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592389 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2907797-7fb3-44c0-81cf-783512fd1bf6-serving-cert\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592469 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-image-import-ca\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592725 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-trusted-ca\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 
09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592864 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdf835c-58e4-4297-a247-690f407af22d-serving-cert\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592916 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-config\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592955 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-console-config\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593131 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0167d9e-5565-4154-80bb-3856d9b5985f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593268 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034ec244-f99c-4c50-a55a-9b33b8b376c3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593278 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7qw9\" (UniqueName: \"kubernetes.io/projected/5f875d21-ddf2-4d41-8be3-819c8836824a-kube-api-access-z7qw9\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593321 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-trusted-ca-bundle\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593609 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-image-import-ca\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593902 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918ba01d-c786-4f9a-ae58-5bcc23684c16-serving-cert\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0167d9e-5565-4154-80bb-3856d9b5985f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: 
\"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594301 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-console-config\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594364 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-tls\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-etcd-serving-ca\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594439 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-service-ca-bundle\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594463 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/6e93d5ac-22fb-4d53-86c4-3262993f2116-node-pullsecrets\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594492 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-serving-cert\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594790 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-config\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.595132 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-etcd-serving-ca\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.595270 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-etcd-client\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.595747 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-oauth-serving-cert\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596041 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034ec244-f99c-4c50-a55a-9b33b8b376c3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596132 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-trusted-ca-bundle\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596262 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e93d5ac-22fb-4d53-86c4-3262993f2116-node-pullsecrets\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596390 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596558 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4zpgn\" (UniqueName: \"kubernetes.io/projected/a2907797-7fb3-44c0-81cf-783512fd1bf6-kube-api-access-4zpgn\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596644 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-encryption-config\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596699 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5h99\" (UniqueName: \"kubernetes.io/projected/6cdf835c-58e4-4297-a247-690f407af22d-kube-api-access-n5h99\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596791 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-audit\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596907 
4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5flzf\" (UniqueName: \"kubernetes.io/projected/6e93d5ac-22fb-4d53-86c4-3262993f2116-kube-api-access-5flzf\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596963 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-client-ca\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597054 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e93d5ac-22fb-4d53-86c4-3262993f2116-audit-dir\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597098 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35cf99cc-0bae-4b8d-b861-103e3174f081-audit-dir\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597172 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-audit-policies\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 
09:06:27.597268 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-config\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597347 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597400 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597435 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-serving-cert\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597490 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llblr\" (UniqueName: \"kubernetes.io/projected/918ba01d-c786-4f9a-ae58-5bcc23684c16-kube-api-access-llblr\") pod \"console-operator-58897d9998-q7qs8\" (UID: 
\"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597646 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-serving-cert\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597779 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-serving-cert\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597830 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-encryption-config\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597852 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-service-ca\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597917 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: 
\"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597936 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.598146 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-client-ca\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.598772 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-tls\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.599082 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.098971598 +0000 UTC m=+254.673716458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.599095 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35cf99cc-0bae-4b8d-b861-103e3174f081-audit-dir\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.599450 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdf835c-58e4-4297-a247-690f407af22d-serving-cert\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.599476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-audit\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.599520 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e93d5ac-22fb-4d53-86c4-3262993f2116-audit-dir\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.599802 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.599986 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-service-ca\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.600184 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-config\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.600748 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-audit-policies\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.600961 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.601961 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-serving-cert\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.602087 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-serving-cert\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.602820 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.604290 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-encryption-config\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.604494 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.604967 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-serving-cert\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.605656 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-serving-cert\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.623734 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.653892 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.663350 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.682810 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698442 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.698611 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.198576475 +0000 UTC m=+254.773321315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698671 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55w8z\" (UniqueName: \"kubernetes.io/projected/c3be356e-94af-47db-a182-dd8a57024619-kube-api-access-55w8z\") pod \"auto-csr-approver-29563746-b66f7\" (UID: \"c3be356e-94af-47db-a182-dd8a57024619\") " pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698718 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmtc\" (UniqueName: \"kubernetes.io/projected/6d6ab3a6-da16-4fc8-9235-2c223661de30-kube-api-access-swmtc\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698746 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c851677f-703c-404c-801c-064cc6bf3979-proxy-tls\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698773 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698837 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-config\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698868 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-ca\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698909 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef74c17c-eb2a-4bef-b948-6b06efd76719-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698954 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46167450-7100-4ac9-a9dd-e678eb3d8677-serving-cert\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698975 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df4jz\" (UniqueName: \"kubernetes.io/projected/496a64ab-b670-4201-9238-d60415ccba17-kube-api-access-df4jz\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698993 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ee6937-a1a5-42ea-a460-29d54478e633-secret-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699010 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d4d183f7-2762-458d-83f1-a8894c00bb82-srv-cert\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699031 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjwps\" (UniqueName: \"kubernetes.io/projected/54961f10-93b0-433f-8a7d-b30d69178e9a-kube-api-access-zjwps\") pod \"auto-csr-approver-29563744-btdt7\" (UID: 
\"54961f10-93b0-433f-8a7d-b30d69178e9a\") " pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699050 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-client-ca\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699067 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0f3c490-ee49-4a88-893e-132592dd6d59-config\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699096 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztqc4\" (UniqueName: \"kubernetes.io/projected/f06790e0-cf8c-48f0-8d48-893663fdbd1c-kube-api-access-ztqc4\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7467j\" (UniqueName: \"kubernetes.io/projected/0a99ad6c-7819-4b33-8846-26e6ede5ce22-kube-api-access-7467j\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699138 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699158 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699176 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5gzz\" (UniqueName: \"kubernetes.io/projected/46167450-7100-4ac9-a9dd-e678eb3d8677-kube-api-access-b5gzz\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699207 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-config\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699224 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/80d2d01c-2b8d-49ff-adad-6b49568293a0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gkpf4\" (UID: \"80d2d01c-2b8d-49ff-adad-6b49568293a0\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699243 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-default-certificate\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699259 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-webhook-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699276 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699293 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c0fd619-d1c2-45e7-a7cf-e784b082428f-trusted-ca\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699308 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496a64ab-b670-4201-9238-d60415ccba17-serving-cert\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699329 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699348 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699392 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b244100-6e88-4ba2-b656-83b6e31d23c8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699451 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxpv5\" (UniqueName: \"kubernetes.io/projected/db81860d-bcb7-4a56-a935-544dbc4be29b-kube-api-access-jxpv5\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699486 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-service-ca-bundle\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699504 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f3c490-ee49-4a88-893e-132592dd6d59-serving-cert\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699523 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699546 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c0fd619-d1c2-45e7-a7cf-e784b082428f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699583 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/f06790e0-cf8c-48f0-8d48-893663fdbd1c-images\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699606 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-policies\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699635 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r5f7\" (UniqueName: \"kubernetes.io/projected/80d2d01c-2b8d-49ff-adad-6b49568293a0-kube-api-access-8r5f7\") pod \"multus-admission-controller-857f4d67dd-gkpf4\" (UID: \"80d2d01c-2b8d-49ff-adad-6b49568293a0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699659 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699682 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-dir\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699700 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qtbr\" (UniqueName: \"kubernetes.io/projected/ba84f396-0169-4d5e-a126-60ac9d6d49f8-kube-api-access-7qtbr\") pod \"control-plane-machine-set-operator-78cbb6b69f-qtggn\" (UID: \"ba84f396-0169-4d5e-a126-60ac9d6d49f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699716 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d4d183f7-2762-458d-83f1-a8894c00bb82-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699741 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699760 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-machine-approver-tls\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699815 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-auth-proxy-config\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700111 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700186 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06790e0-cf8c-48f0-8d48-893663fdbd1c-config\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700241 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjfwr\" (UniqueName: \"kubernetes.io/projected/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-kube-api-access-xjfwr\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700353 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b244100-6e88-4ba2-b656-83b6e31d23c8-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700402 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-cert\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700453 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-config\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700484 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq5ms\" (UniqueName: \"kubernetes.io/projected/d4d183f7-2762-458d-83f1-a8894c00bb82-kube-api-access-sq5ms\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700548 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6d6ab3a6-da16-4fc8-9235-2c223661de30-tmpfs\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700589 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-metrics-certs\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700617 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92ctw\" (UniqueName: \"kubernetes.io/projected/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-kube-api-access-92ctw\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700650 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-client\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700704 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700766 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-service-ca\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700799 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-key\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700829 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fgmb\" (UniqueName: \"kubernetes.io/projected/ef74c17c-eb2a-4bef-b948-6b06efd76719-kube-api-access-2fgmb\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700871 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njhmj\" (UniqueName: \"kubernetes.io/projected/6c0fd619-d1c2-45e7-a7cf-e784b082428f-kube-api-access-njhmj\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700901 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcvjk\" (UniqueName: \"kubernetes.io/projected/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-kube-api-access-xcvjk\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700929 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5n2d\" (UniqueName: \"kubernetes.io/projected/08b964cf-bfc5-4b90-83a3-0b358c3ffbc9-kube-api-access-p5n2d\") pod \"downloads-7954f5f757-tnw27\" (UID: \"08b964cf-bfc5-4b90-83a3-0b358c3ffbc9\") " pod="openshift-console/downloads-7954f5f757-tnw27" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700978 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c0fd619-d1c2-45e7-a7cf-e784b082428f-metrics-tls\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701014 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c851677f-703c-404c-801c-064cc6bf3979-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701037 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701057 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-cabundle\") pod 
\"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701079 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701102 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcgvq\" (UniqueName: \"kubernetes.io/projected/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-kube-api-access-jcgvq\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701143 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701171 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba84f396-0169-4d5e-a126-60ac9d6d49f8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qtggn\" (UID: \"ba84f396-0169-4d5e-a126-60ac9d6d49f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:27 
crc kubenswrapper[4778]: I0318 09:06:27.701213 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-stats-auth\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701240 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701260 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9n22\" (UniqueName: \"kubernetes.io/projected/97ee6937-a1a5-42ea-a460-29d54478e633-kube-api-access-h9n22\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701280 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f06790e0-cf8c-48f0-8d48-893663fdbd1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701300 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjbvm\" (UniqueName: 
\"kubernetes.io/projected/0b244100-6e88-4ba2-b656-83b6e31d23c8-kube-api-access-vjbvm\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701329 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f5nq\" (UniqueName: \"kubernetes.io/projected/c0f3c490-ee49-4a88-893e-132592dd6d59-kube-api-access-7f5nq\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701352 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef74c17c-eb2a-4bef-b948-6b06efd76719-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701372 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701394 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt67h\" (UniqueName: \"kubernetes.io/projected/e24d15f2-56e5-4fcc-91ab-370d7b4fb41e-kube-api-access-dt67h\") pod \"migrator-59844c95c7-jbw52\" (UID: 
\"e24d15f2-56e5-4fcc-91ab-370d7b4fb41e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701411 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-apiservice-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701429 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b244100-6e88-4ba2-b656-83b6e31d23c8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701450 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4699k\" (UniqueName: \"kubernetes.io/projected/c851677f-703c-404c-801c-064cc6bf3979-kube-api-access-4699k\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.701957 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.201941736 +0000 UTC m=+254.776686566 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.703878 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.723112 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.743114 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.763582 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.783657 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.801993 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.802266 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.302186261 +0000 UTC m=+254.876931101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802339 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztqc4\" (UniqueName: \"kubernetes.io/projected/f06790e0-cf8c-48f0-8d48-893663fdbd1c-kube-api-access-ztqc4\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802408 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7467j\" (UniqueName: \"kubernetes.io/projected/0a99ad6c-7819-4b33-8846-26e6ede5ce22-kube-api-access-7467j\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802476 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802518 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5gzz\" (UniqueName: \"kubernetes.io/projected/46167450-7100-4ac9-a9dd-e678eb3d8677-kube-api-access-b5gzz\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802548 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-config\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802617 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/80d2d01c-2b8d-49ff-adad-6b49568293a0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gkpf4\" (UID: \"80d2d01c-2b8d-49ff-adad-6b49568293a0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802654 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-default-certificate\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" 
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802685 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-webhook-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802717 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802752 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c0fd619-d1c2-45e7-a7cf-e784b082428f-trusted-ca\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802783 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802815 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496a64ab-b670-4201-9238-d60415ccba17-serving-cert\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: 
\"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802848 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802878 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b244100-6e88-4ba2-b656-83b6e31d23c8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802921 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f3c490-ee49-4a88-893e-132592dd6d59-serving-cert\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802950 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxpv5\" (UniqueName: \"kubernetes.io/projected/db81860d-bcb7-4a56-a935-544dbc4be29b-kube-api-access-jxpv5\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802979 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-service-ca-bundle\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803011 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803037 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c0fd619-d1c2-45e7-a7cf-e784b082428f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803074 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f06790e0-cf8c-48f0-8d48-893663fdbd1c-images\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-policies\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803181 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r5f7\" (UniqueName: \"kubernetes.io/projected/80d2d01c-2b8d-49ff-adad-6b49568293a0-kube-api-access-8r5f7\") pod \"multus-admission-controller-857f4d67dd-gkpf4\" (UID: \"80d2d01c-2b8d-49ff-adad-6b49568293a0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803258 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803271 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803284 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803325 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-dir\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803363 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7qtbr\" (UniqueName: \"kubernetes.io/projected/ba84f396-0169-4d5e-a126-60ac9d6d49f8-kube-api-access-7qtbr\") pod \"control-plane-machine-set-operator-78cbb6b69f-qtggn\" (UID: \"ba84f396-0169-4d5e-a126-60ac9d6d49f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803395 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d4d183f7-2762-458d-83f1-a8894c00bb82-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803436 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803465 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-machine-approver-tls\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803494 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803525 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-auth-proxy-config\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803564 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06790e0-cf8c-48f0-8d48-893663fdbd1c-config\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803588 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjfwr\" (UniqueName: \"kubernetes.io/projected/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-kube-api-access-xjfwr\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803625 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-cert\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803653 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b244100-6e88-4ba2-b656-83b6e31d23c8-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803689 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-config\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803716 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq5ms\" (UniqueName: \"kubernetes.io/projected/d4d183f7-2762-458d-83f1-a8894c00bb82-kube-api-access-sq5ms\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803755 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92ctw\" (UniqueName: \"kubernetes.io/projected/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-kube-api-access-92ctw\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803782 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6d6ab3a6-da16-4fc8-9235-2c223661de30-tmpfs\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803808 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-metrics-certs\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803845 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-client\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803869 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-service-ca\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803983 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-key\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804017 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804032 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fgmb\" (UniqueName: \"kubernetes.io/projected/ef74c17c-eb2a-4bef-b948-6b06efd76719-kube-api-access-2fgmb\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804074 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njhmj\" (UniqueName: \"kubernetes.io/projected/6c0fd619-d1c2-45e7-a7cf-e784b082428f-kube-api-access-njhmj\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804109 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5n2d\" (UniqueName: \"kubernetes.io/projected/08b964cf-bfc5-4b90-83a3-0b358c3ffbc9-kube-api-access-p5n2d\") pod \"downloads-7954f5f757-tnw27\" (UID: \"08b964cf-bfc5-4b90-83a3-0b358c3ffbc9\") " pod="openshift-console/downloads-7954f5f757-tnw27" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804134 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcvjk\" (UniqueName: \"kubernetes.io/projected/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-kube-api-access-xcvjk\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804188 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c0fd619-d1c2-45e7-a7cf-e784b082428f-metrics-tls\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804253 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c851677f-703c-404c-801c-064cc6bf3979-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804283 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804330 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-cabundle\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804362 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804391 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804418 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcgvq\" (UniqueName: \"kubernetes.io/projected/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-kube-api-access-jcgvq\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804455 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-stats-auth\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804491 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba84f396-0169-4d5e-a126-60ac9d6d49f8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qtggn\" (UID: \"ba84f396-0169-4d5e-a126-60ac9d6d49f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:27 
crc kubenswrapper[4778]: I0318 09:06:27.804530 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f06790e0-cf8c-48f0-8d48-893663fdbd1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804553 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbvm\" (UniqueName: \"kubernetes.io/projected/0b244100-6e88-4ba2-b656-83b6e31d23c8-kube-api-access-vjbvm\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804579 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804599 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b244100-6e88-4ba2-b656-83b6e31d23c8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804604 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9n22\" (UniqueName: 
\"kubernetes.io/projected/97ee6937-a1a5-42ea-a460-29d54478e633-kube-api-access-h9n22\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804640 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804679 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f5nq\" (UniqueName: \"kubernetes.io/projected/c0f3c490-ee49-4a88-893e-132592dd6d59-kube-api-access-7f5nq\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804710 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef74c17c-eb2a-4bef-b948-6b06efd76719-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804739 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 
18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt67h\" (UniqueName: \"kubernetes.io/projected/e24d15f2-56e5-4fcc-91ab-370d7b4fb41e-kube-api-access-dt67h\") pod \"migrator-59844c95c7-jbw52\" (UID: \"e24d15f2-56e5-4fcc-91ab-370d7b4fb41e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804784 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4699k\" (UniqueName: \"kubernetes.io/projected/c851677f-703c-404c-801c-064cc6bf3979-kube-api-access-4699k\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804801 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-apiservice-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804817 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b244100-6e88-4ba2-b656-83b6e31d23c8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804837 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55w8z\" (UniqueName: 
\"kubernetes.io/projected/c3be356e-94af-47db-a182-dd8a57024619-kube-api-access-55w8z\") pod \"auto-csr-approver-29563746-b66f7\" (UID: \"c3be356e-94af-47db-a182-dd8a57024619\") " pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804864 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swmtc\" (UniqueName: \"kubernetes.io/projected/6d6ab3a6-da16-4fc8-9235-2c223661de30-kube-api-access-swmtc\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804885 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-ca\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804904 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c851677f-703c-404c-801c-064cc6bf3979-proxy-tls\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804921 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804939 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-config\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804960 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef74c17c-eb2a-4bef-b948-6b06efd76719-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804979 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46167450-7100-4ac9-a9dd-e678eb3d8677-serving-cert\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805007 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df4jz\" (UniqueName: \"kubernetes.io/projected/496a64ab-b670-4201-9238-d60415ccba17-kube-api-access-df4jz\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805039 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ee6937-a1a5-42ea-a460-29d54478e633-secret-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805066 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjwps\" (UniqueName: \"kubernetes.io/projected/54961f10-93b0-433f-8a7d-b30d69178e9a-kube-api-access-zjwps\") pod \"auto-csr-approver-29563744-btdt7\" (UID: \"54961f10-93b0-433f-8a7d-b30d69178e9a\") " pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d4d183f7-2762-458d-83f1-a8894c00bb82-srv-cert\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805126 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-client-ca\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0f3c490-ee49-4a88-893e-132592dd6d59-config\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805642 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c851677f-703c-404c-801c-064cc6bf3979-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805857 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-policies\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.806042 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.306030746 +0000 UTC m=+254.880775826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.806049 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef74c17c-eb2a-4bef-b948-6b06efd76719-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.806372 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-service-ca-bundle\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.806964 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-ca\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.807030 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.807081 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-dir\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.807144 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.808097 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-config\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.809239 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-client-ca\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.809779 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6d6ab3a6-da16-4fc8-9235-2c223661de30-tmpfs\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.809870 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.809913 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-service-ca\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: 
I0318 09:06:27.810699 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496a64ab-b670-4201-9238-d60415ccba17-serving-cert\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.811337 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.812970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-config\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.813368 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-client\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.813556 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.813598 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.813782 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.815335 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.815757 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-default-certificate\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.816054 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ba84f396-0169-4d5e-a126-60ac9d6d49f8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qtggn\" (UID: \"ba84f396-0169-4d5e-a126-60ac9d6d49f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.816487 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.816975 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-metrics-certs\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.817334 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-stats-auth\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.818385 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c851677f-703c-404c-801c-064cc6bf3979-proxy-tls\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.818862 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef74c17c-eb2a-4bef-b948-6b06efd76719-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.823617 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.826558 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46167450-7100-4ac9-a9dd-e678eb3d8677-serving-cert\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.845037 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.865314 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.867678 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f06790e0-cf8c-48f0-8d48-893663fdbd1c-images\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.884896 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.902518 4778 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.906187 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.906360 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.406328862 +0000 UTC m=+254.981073742 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.907006 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.907416 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.407405311 +0000 UTC m=+254.982150171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.909909 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f06790e0-cf8c-48f0-8d48-893663fdbd1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.922327 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.930377 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06790e0-cf8c-48f0-8d48-893663fdbd1c-config\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.943666 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.963467 4778 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.984589 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.988957 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c0fd619-d1c2-45e7-a7cf-e784b082428f-metrics-tls\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.004074 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.008439 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.008681 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.508616093 +0000 UTC m=+255.083360973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.009271 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.009836 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.509797514 +0000 UTC m=+255.084542354 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.033691 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.043602 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.045902 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c0fd619-d1c2-45e7-a7cf-e784b082428f-trusted-ca\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.063503 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.083151 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.103437 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.110589 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.111038 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.611002455 +0000 UTC m=+255.185747305 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.111384 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.112091 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.612050064 +0000 UTC m=+255.186795094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.123123 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.143161 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.163903 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.183765 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.202863 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.212852 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.213183 4778 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.712992918 +0000 UTC m=+255.287737798 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.214152 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.214685 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.714665233 +0000 UTC m=+255.289410113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.225277 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.243897 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.265591 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.276549 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-machine-approver-tls\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.283757 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.303429 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.316177 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.316454 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.816375688 +0000 UTC m=+255.391120578 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.317129 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.317863 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.817836067 +0000 UTC m=+255.392580947 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.323175 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.343276 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.350654 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-auth-proxy-config\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.362900 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.370375 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-config\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.383920 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.404116 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.418865 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.419094 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.919061559 +0000 UTC m=+255.493806439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.420481 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.421113 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.921086284 +0000 UTC m=+255.495831164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.423375 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.443333 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.463813 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.482761 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.501886 4778 request.go:700] Waited for 1.014673143s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dconfig&limit=500&resourceVersion=0 Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.503774 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.522836 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.523129 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.023083427 +0000 UTC m=+255.597828317 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.523541 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.524137 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.524231 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:29.024214677 +0000 UTC m=+255.598959517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.543706 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.551668 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b244100-6e88-4ba2-b656-83b6e31d23c8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.564515 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.583633 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.604417 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.623817 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 09:06:28 
crc kubenswrapper[4778]: I0318 09:06:28.624381 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.624537 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.124509354 +0000 UTC m=+255.699254204 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.624811 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.625171 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:29.125160221 +0000 UTC m=+255.699905071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.632564 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/80d2d01c-2b8d-49ff-adad-6b49568293a0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gkpf4\" (UID: \"80d2d01c-2b8d-49ff-adad-6b49568293a0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.644253 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.663010 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.682908 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.690847 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d4d183f7-2762-458d-83f1-a8894c00bb82-srv-cert\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:28 crc 
kubenswrapper[4778]: I0318 09:06:28.703340 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.710606 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ee6937-a1a5-42ea-a460-29d54478e633-secret-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.710957 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d4d183f7-2762-458d-83f1-a8894c00bb82-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.723384 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.725821 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.726003 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.225964502 +0000 UTC m=+255.800709352 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.726124 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.726557 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.226538797 +0000 UTC m=+255.801283637 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.743469 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.763708 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.782848 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.803546 4778 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.803657 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0f3c490-ee49-4a88-893e-132592dd6d59-serving-cert podName:c0f3c490-ee49-4a88-893e-132592dd6d59 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.303625228 +0000 UTC m=+255.878370078 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c0f3c490-ee49-4a88-893e-132592dd6d59-serving-cert") pod "service-ca-operator-777779d784-57msj" (UID: "c0f3c490-ee49-4a88-893e-132592dd6d59") : failed to sync secret cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.803985 4778 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.804037 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca podName:f25fe9ee-95f7-4a7c-98f1-7dabbd43527a nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.304022489 +0000 UTC m=+255.878767339 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca") pod "marketplace-operator-79b997595-2hr48" (UID: "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a") : failed to sync configmap cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.804142 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806105 4778 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806167 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-cabundle podName:0a99ad6c-7819-4b33-8846-26e6ede5ce22 nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:29.306153526 +0000 UTC m=+255.880898376 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-cabundle") pod "service-ca-9c57cc56f-6frtc" (UID: "0a99ad6c-7819-4b33-8846-26e6ede5ce22") : failed to sync configmap cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806400 4778 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806544 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume podName:97ee6937-a1a5-42ea-a460-29d54478e633 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.306501196 +0000 UTC m=+255.881246136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume") pod "collect-profiles-29563740-wbvxl" (UID: "97ee6937-a1a5-42ea-a460-29d54478e633") : failed to sync configmap cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806626 4778 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806707 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c0f3c490-ee49-4a88-893e-132592dd6d59-config podName:c0f3c490-ee49-4a88-893e-132592dd6d59 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.306686551 +0000 UTC m=+255.881431681 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c0f3c490-ee49-4a88-893e-132592dd6d59-config") pod "service-ca-operator-777779d784-57msj" (UID: "c0f3c490-ee49-4a88-893e-132592dd6d59") : failed to sync configmap cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806773 4778 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806866 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-apiservice-cert podName:6d6ab3a6-da16-4fc8-9235-2c223661de30 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.306846405 +0000 UTC m=+255.881591275 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-apiservice-cert") pod "packageserver-d55dfcdfc-5vjr4" (UID: "6d6ab3a6-da16-4fc8-9235-2c223661de30") : failed to sync secret cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.808029 4778 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.808112 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-webhook-cert podName:6d6ab3a6-da16-4fc8-9235-2c223661de30 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.308092548 +0000 UTC m=+255.882837418 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-webhook-cert") pod "packageserver-d55dfcdfc-5vjr4" (UID: "6d6ab3a6-da16-4fc8-9235-2c223661de30") : failed to sync secret cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.809418 4778 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.809483 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-cert podName:bfcc4e0d-0910-42c0-bcac-44c6aee8b74d nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.309469666 +0000 UTC m=+255.884214506 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-cert") pod "ingress-canary-jmmm2" (UID: "bfcc4e0d-0910-42c0-bcac-44c6aee8b74d") : failed to sync secret cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.810171 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.811656 4778 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.811743 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-key 
podName:0a99ad6c-7819-4b33-8846-26e6ede5ce22 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.311722297 +0000 UTC m=+255.886467347 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-key") pod "service-ca-9c57cc56f-6frtc" (UID: "0a99ad6c-7819-4b33-8846-26e6ede5ce22") : failed to sync secret cache: timed out waiting for the condition Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.829008 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.829319 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.32927573 +0000 UTC m=+255.904020600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.830064 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.830545 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.330524493 +0000 UTC m=+255.905269343 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.836893 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.843835 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.864571 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.882827 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.903718 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.924085 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.931826 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.932086 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.432054094 +0000 UTC m=+256.006798974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.932885 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.933286 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.433276547 +0000 UTC m=+256.008021387 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.944684 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.963872 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.983955 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.003687 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.023776 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.034160 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.034415 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.534361615 +0000 UTC m=+256.109106505 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.034966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.035602 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.535581647 +0000 UTC m=+256.110326557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.044316 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.063360 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.083027 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.104101 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.123656 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.136877 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.137135 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.637099718 +0000 UTC m=+256.211844588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.138116 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.138673 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.638648789 +0000 UTC m=+256.213393669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.144245 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.185139 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.203710 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.224441 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.239774 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.240093 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.740047135 +0000 UTC m=+256.314792025 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.241171 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.241704 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.74168055 +0000 UTC m=+256.316425430 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.244736 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.264622 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.284394 4778 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.303624 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.322971 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.342058 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.342248 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.842190413 +0000 UTC m=+256.416935273 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.342372 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-key\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.342685 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-cabundle\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.343483 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.343887 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.843875128 +0000 UTC m=+256.418619978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.343900 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.343907 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-cabundle\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.344758 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.344868 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.345375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0f3c490-ee49-4a88-893e-132592dd6d59-config\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.345608 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.345664 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-webhook-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.345738 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f3c490-ee49-4a88-893e-132592dd6d59-serving-cert\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.345999 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.346057 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-cert\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.348584 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.348704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0f3c490-ee49-4a88-893e-132592dd6d59-config\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.350781 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-apiservice-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.357263 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/c0f3c490-ee49-4a88-893e-132592dd6d59-serving-cert\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.359558 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-webhook-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.361524 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-key\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.364062 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.384144 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.404263 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.412715 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-cert\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 
09:06:29.423655 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.443584 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.446996 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.447190 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.947156135 +0000 UTC m=+256.521900985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.447327 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.447701 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.94768606 +0000 UTC m=+256.522430910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.492061 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p7td\" (UniqueName: \"kubernetes.io/projected/35cf99cc-0bae-4b8d-b861-103e3174f081-kube-api-access-5p7td\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.501682 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc9zp\" (UniqueName: \"kubernetes.io/projected/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-kube-api-access-hc9zp\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.520746 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kd5b\" (UniqueName: \"kubernetes.io/projected/034ec244-f99c-4c50-a55a-9b33b8b376c3-kube-api-access-9kd5b\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.522051 4778 request.go:700] Waited for 1.934565618s due to client-side throttling, not priority and fairness, request: 
POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/serviceaccounts/openshift-kube-scheduler-operator/token Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.549091 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.549347 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.049305502 +0000 UTC m=+256.624050382 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.549849 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.550615 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.050588446 +0000 UTC m=+256.625333396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.553664 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0167d9e-5565-4154-80bb-3856d9b5985f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.560783 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-bound-sa-token\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.580028 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsq4q\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-kube-api-access-fsq4q\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc 
kubenswrapper[4778]: I0318 09:06:29.606554 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.611867 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.616106 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7qw9\" (UniqueName: \"kubernetes.io/projected/5f875d21-ddf2-4d41-8be3-819c8836824a-kube-api-access-z7qw9\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.621435 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5flzf\" (UniqueName: \"kubernetes.io/projected/6e93d5ac-22fb-4d53-86c4-3262993f2116-kube-api-access-5flzf\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.644947 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llblr\" (UniqueName: \"kubernetes.io/projected/918ba01d-c786-4f9a-ae58-5bcc23684c16-kube-api-access-llblr\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.651046 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.651492 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.151455868 +0000 UTC m=+256.726200748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.651556 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.652816 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.152797625 +0000 UTC m=+256.727542495 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.662753 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5h99\" (UniqueName: \"kubernetes.io/projected/6cdf835c-58e4-4297-a247-690f407af22d-kube-api-access-n5h99\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.683243 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zpgn\" (UniqueName: \"kubernetes.io/projected/a2907797-7fb3-44c0-81cf-783512fd1bf6-kube-api-access-4zpgn\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.720974 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztqc4\" (UniqueName: \"kubernetes.io/projected/f06790e0-cf8c-48f0-8d48-893663fdbd1c-kube-api-access-ztqc4\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.745148 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.748133 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7467j\" (UniqueName: \"kubernetes.io/projected/0a99ad6c-7819-4b33-8846-26e6ede5ce22-kube-api-access-7467j\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.753651 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.753817 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.253790891 +0000 UTC m=+256.828535741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.754060 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.754435 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.254418557 +0000 UTC m=+256.829163417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.769942 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5gzz\" (UniqueName: \"kubernetes.io/projected/46167450-7100-4ac9-a9dd-e678eb3d8677-kube-api-access-b5gzz\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.772592 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.777574 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.786958 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.788330 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fgmb\" (UniqueName: \"kubernetes.io/projected/ef74c17c-eb2a-4bef-b948-6b06efd76719-kube-api-access-2fgmb\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.796823 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.808298 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r5f7\" (UniqueName: \"kubernetes.io/projected/80d2d01c-2b8d-49ff-adad-6b49568293a0-kube-api-access-8r5f7\") pod \"multus-admission-controller-857f4d67dd-gkpf4\" (UID: \"80d2d01c-2b8d-49ff-adad-6b49568293a0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.809499 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.811619 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.819618 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9n22\" (UniqueName: \"kubernetes.io/projected/97ee6937-a1a5-42ea-a460-29d54478e633-kube-api-access-h9n22\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.838066 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.841924 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njhmj\" (UniqueName: \"kubernetes.io/projected/6c0fd619-d1c2-45e7-a7cf-e784b082428f-kube-api-access-njhmj\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.844997 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.855265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.856525 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.356507023 +0000 UTC m=+256.931251863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.857367 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.859928 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5n2d\" (UniqueName: \"kubernetes.io/projected/08b964cf-bfc5-4b90-83a3-0b358c3ffbc9-kube-api-access-p5n2d\") pod \"downloads-7954f5f757-tnw27\" (UID: \"08b964cf-bfc5-4b90-83a3-0b358c3ffbc9\") " pod="openshift-console/downloads-7954f5f757-tnw27" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.874065 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.880447 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcvjk\" (UniqueName: \"kubernetes.io/projected/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-kube-api-access-xcvjk\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.901496 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxpv5\" (UniqueName: \"kubernetes.io/projected/db81860d-bcb7-4a56-a935-544dbc4be29b-kube-api-access-jxpv5\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.921469 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f5nq\" (UniqueName: \"kubernetes.io/projected/c0f3c490-ee49-4a88-893e-132592dd6d59-kube-api-access-7f5nq\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:29 crc 
kubenswrapper[4778]: I0318 09:06:29.940010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c0fd619-d1c2-45e7-a7cf-e784b082428f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.945946 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.959226 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.959698 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.459685817 +0000 UTC m=+257.034430657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.973755 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt67h\" (UniqueName: \"kubernetes.io/projected/e24d15f2-56e5-4fcc-91ab-370d7b4fb41e-kube-api-access-dt67h\") pod \"migrator-59844c95c7-jbw52\" (UID: \"e24d15f2-56e5-4fcc-91ab-370d7b4fb41e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.976794 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.984978 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4699k\" (UniqueName: \"kubernetes.io/projected/c851677f-703c-404c-801c-064cc6bf3979-kube-api-access-4699k\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.999728 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-smtz9"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.006977 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55w8z\" (UniqueName: 
\"kubernetes.io/projected/c3be356e-94af-47db-a182-dd8a57024619-kube-api-access-55w8z\") pod \"auto-csr-approver-29563746-b66f7\" (UID: \"c3be356e-94af-47db-a182-dd8a57024619\") " pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.020667 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmtc\" (UniqueName: \"kubernetes.io/projected/6d6ab3a6-da16-4fc8-9235-2c223661de30-kube-api-access-swmtc\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.033685 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:30 crc kubenswrapper[4778]: W0318 09:06:30.038501 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf06790e0_cf8c_48f0_8d48_893663fdbd1c.slice/crio-afb06ad3e0bcd75590f17c823bc1aff440fe73faa5a7fd84f4912126237879bd WatchSource:0}: Error finding container afb06ad3e0bcd75590f17c823bc1aff440fe73faa5a7fd84f4912126237879bd: Status 404 returned error can't find the container with id afb06ad3e0bcd75590f17c823bc1aff440fe73faa5a7fd84f4912126237879bd Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.038958 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbvm\" (UniqueName: \"kubernetes.io/projected/0b244100-6e88-4ba2-b656-83b6e31d23c8-kube-api-access-vjbvm\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.052341 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.060366 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.060902 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.560882288 +0000 UTC m=+257.135627128 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.065358 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df4jz\" (UniqueName: \"kubernetes.io/projected/496a64ab-b670-4201-9238-d60415ccba17-kube-api-access-df4jz\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.073124 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.085892 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjwps\" (UniqueName: \"kubernetes.io/projected/54961f10-93b0-433f-8a7d-b30d69178e9a-kube-api-access-zjwps\") pod \"auto-csr-approver-29563744-btdt7\" (UID: \"54961f10-93b0-433f-8a7d-b30d69178e9a\") " pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.101988 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-tnw27" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.103318 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qtbr\" (UniqueName: \"kubernetes.io/projected/ba84f396-0169-4d5e-a126-60ac9d6d49f8-kube-api-access-7qtbr\") pod \"control-plane-machine-set-operator-78cbb6b69f-qtggn\" (UID: \"ba84f396-0169-4d5e-a126-60ac9d6d49f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.118714 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.120002 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.120059 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.128846 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcgvq\" (UniqueName: \"kubernetes.io/projected/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-kube-api-access-jcgvq\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.129170 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.138028 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.146263 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq5ms\" (UniqueName: \"kubernetes.io/projected/d4d183f7-2762-458d-83f1-a8894c00bb82-kube-api-access-sq5ms\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.151423 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.151510 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.161758 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.162383 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:30.662371957 +0000 UTC m=+257.237116797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.164520 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.178785 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b244100-6e88-4ba2-b656-83b6e31d23c8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.179771 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjfwr\" (UniqueName: \"kubernetes.io/projected/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-kube-api-access-xjfwr\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.181630 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.224214 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.241417 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pgsqh"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.241767 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.252277 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.260150 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263095 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263312 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75e6cce3-f879-4c6b-8ef3-8d2a4feecae1-metrics-tls\") pod \"dns-operator-744455d44c-mlr7l\" (UID: \"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263339 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx9dg\" (UniqueName: \"kubernetes.io/projected/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-kube-api-access-qx9dg\") pod 
\"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263362 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm74g\" (UniqueName: \"kubernetes.io/projected/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-kube-api-access-tm74g\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263389 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-mountpoint-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263418 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd030212-5b03-4555-b885-388260b53588-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263434 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/533b6e54-efa5-4032-bebd-eedc39a834b8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: 
I0318 09:06:30.263468 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsjnk\" (UniqueName: \"kubernetes.io/projected/de938bf1-1696-46c9-b6af-9a3766846e8d-kube-api-access-tsjnk\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263486 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b84z\" (UniqueName: \"kubernetes.io/projected/533b6e54-efa5-4032-bebd-eedc39a834b8-kube-api-access-8b84z\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263512 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-profile-collector-cert\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263529 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/577d365f-ec95-4de4-a6a4-6752b2f0de56-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-txb7s\" (UID: \"577d365f-ec95-4de4-a6a4-6752b2f0de56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263544 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd030212-5b03-4555-b885-388260b53588-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263568 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/533b6e54-efa5-4032-bebd-eedc39a834b8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263587 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-registration-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/de938bf1-1696-46c9-b6af-9a3766846e8d-certs\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263618 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/de938bf1-1696-46c9-b6af-9a3766846e8d-node-bootstrap-token\") pod \"machine-config-server-qqzxx\" (UID: 
\"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263634 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263651 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-proxy-tls\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263675 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqc9n\" (UniqueName: \"kubernetes.io/projected/577d365f-ec95-4de4-a6a4-6752b2f0de56-kube-api-access-pqc9n\") pod \"cluster-samples-operator-665b6dd947-txb7s\" (UID: \"577d365f-ec95-4de4-a6a4-6752b2f0de56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263693 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk4m9\" (UniqueName: \"kubernetes.io/projected/75e6cce3-f879-4c6b-8ef3-8d2a4feecae1-kube-api-access-fk4m9\") pod \"dns-operator-744455d44c-mlr7l\" (UID: \"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263707 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd030212-5b03-4555-b885-388260b53588-config\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263742 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-config-volume\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263779 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-plugins-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263796 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z725k\" (UniqueName: \"kubernetes.io/projected/bead92a8-42de-4171-9c0c-790d64a6d14a-kube-api-access-z725k\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263812 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2gwc\" (UniqueName: \"kubernetes.io/projected/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-kube-api-access-l2gwc\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263827 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-csi-data-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263864 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-images\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263887 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/760c21cc-aac0-45ad-9d41-94ff93b92c44-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vgscx\" (UID: \"760c21cc-aac0-45ad-9d41-94ff93b92c44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263904 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/650a32b4-d961-4805-8521-f1f24de6ad4a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263922 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-socket-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263937 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/650a32b4-d961-4805-8521-f1f24de6ad4a-config\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263956 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-metrics-tls\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.264005 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/650a32b4-d961-4805-8521-f1f24de6ad4a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.264038 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55g67\" (UniqueName: \"kubernetes.io/projected/760c21cc-aac0-45ad-9d41-94ff93b92c44-kube-api-access-55g67\") pod \"package-server-manager-789f6589d5-vgscx\" (UID: \"760c21cc-aac0-45ad-9d41-94ff93b92c44\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.264081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-srv-cert\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.264793 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.764698708 +0000 UTC m=+257.339443548 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.283019 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.297072 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92ctw\" (UniqueName: \"kubernetes.io/projected/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-kube-api-access-92ctw\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:30 crc kubenswrapper[4778]: W0318 09:06:30.357789 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a8d0909_d7da_49bd_bd5b_0f3ca5a61637.slice/crio-d169670b40d53da31764b4cc4e5df28c3ae9d827ab8db481e9ff9be957e4d6a9 WatchSource:0}: Error finding container d169670b40d53da31764b4cc4e5df28c3ae9d827ab8db481e9ff9be957e4d6a9: Status 404 returned error can't find the container with id d169670b40d53da31764b4cc4e5df28c3ae9d827ab8db481e9ff9be957e4d6a9 Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.364869 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-images\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365013 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/760c21cc-aac0-45ad-9d41-94ff93b92c44-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vgscx\" (UID: \"760c21cc-aac0-45ad-9d41-94ff93b92c44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 
09:06:30.365044 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/650a32b4-d961-4805-8521-f1f24de6ad4a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365068 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-socket-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365083 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/650a32b4-d961-4805-8521-f1f24de6ad4a-config\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365133 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-metrics-tls\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365170 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/650a32b4-d961-4805-8521-f1f24de6ad4a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 
09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365243 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55g67\" (UniqueName: \"kubernetes.io/projected/760c21cc-aac0-45ad-9d41-94ff93b92c44-kube-api-access-55g67\") pod \"package-server-manager-789f6589d5-vgscx\" (UID: \"760c21cc-aac0-45ad-9d41-94ff93b92c44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365321 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-srv-cert\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365417 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75e6cce3-f879-4c6b-8ef3-8d2a4feecae1-metrics-tls\") pod \"dns-operator-744455d44c-mlr7l\" (UID: \"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365438 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm74g\" (UniqueName: \"kubernetes.io/projected/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-kube-api-access-tm74g\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365487 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx9dg\" (UniqueName: \"kubernetes.io/projected/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-kube-api-access-qx9dg\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: 
\"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365513 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-mountpoint-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365550 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd030212-5b03-4555-b885-388260b53588-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365569 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/533b6e54-efa5-4032-bebd-eedc39a834b8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365598 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsjnk\" (UniqueName: \"kubernetes.io/projected/de938bf1-1696-46c9-b6af-9a3766846e8d-kube-api-access-tsjnk\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365633 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8b84z\" (UniqueName: \"kubernetes.io/projected/533b6e54-efa5-4032-bebd-eedc39a834b8-kube-api-access-8b84z\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365654 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-profile-collector-cert\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365693 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/577d365f-ec95-4de4-a6a4-6752b2f0de56-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-txb7s\" (UID: \"577d365f-ec95-4de4-a6a4-6752b2f0de56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365714 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd030212-5b03-4555-b885-388260b53588-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365731 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/533b6e54-efa5-4032-bebd-eedc39a834b8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365785 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-registration-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365803 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/de938bf1-1696-46c9-b6af-9a3766846e8d-certs\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365840 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/de938bf1-1696-46c9-b6af-9a3766846e8d-node-bootstrap-token\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365879 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365908 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-proxy-tls\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365925 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqc9n\" (UniqueName: \"kubernetes.io/projected/577d365f-ec95-4de4-a6a4-6752b2f0de56-kube-api-access-pqc9n\") pod \"cluster-samples-operator-665b6dd947-txb7s\" (UID: \"577d365f-ec95-4de4-a6a4-6752b2f0de56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk4m9\" (UniqueName: \"kubernetes.io/projected/75e6cce3-f879-4c6b-8ef3-8d2a4feecae1-kube-api-access-fk4m9\") pod \"dns-operator-744455d44c-mlr7l\" (UID: \"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365997 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd030212-5b03-4555-b885-388260b53588-config\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.366056 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-config-volume\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.366073 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-plugins-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.366090 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z725k\" (UniqueName: \"kubernetes.io/projected/bead92a8-42de-4171-9c0c-790d64a6d14a-kube-api-access-z725k\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.366127 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2gwc\" (UniqueName: \"kubernetes.io/projected/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-kube-api-access-l2gwc\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.366142 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-csi-data-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: 
\"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.373248 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/650a32b4-d961-4805-8521-f1f24de6ad4a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.380727 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-socket-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.381443 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/650a32b4-d961-4805-8521-f1f24de6ad4a-config\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.383627 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.384211 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-mountpoint-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.387086 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.387424 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-csi-data-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.387928 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-registration-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.391352 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-config-volume\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.392372 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/760c21cc-aac0-45ad-9d41-94ff93b92c44-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vgscx\" (UID: \"760c21cc-aac0-45ad-9d41-94ff93b92c44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.393104 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.393751 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.8937253 +0000 UTC m=+257.468470130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.393787 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/533b6e54-efa5-4032-bebd-eedc39a834b8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.393892 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-images\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.394956 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-plugins-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.396997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd030212-5b03-4555-b885-388260b53588-config\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.397215 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.407031 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/de938bf1-1696-46c9-b6af-9a3766846e8d-certs\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.413409 4778 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bhmq7"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.415085 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-srv-cert\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.417864 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-metrics-tls\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.418500 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.423803 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-proxy-tls\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.427838 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-profile-collector-cert\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.428771 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/650a32b4-d961-4805-8521-f1f24de6ad4a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.430690 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/533b6e54-efa5-4032-bebd-eedc39a834b8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.431846 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd030212-5b03-4555-b885-388260b53588-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.432222 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/577d365f-ec95-4de4-a6a4-6752b2f0de56-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-txb7s\" (UID: \"577d365f-ec95-4de4-a6a4-6752b2f0de56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.434731 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75e6cce3-f879-4c6b-8ef3-8d2a4feecae1-metrics-tls\") pod \"dns-operator-744455d44c-mlr7l\" (UID: 
\"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.440712 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx9dg\" (UniqueName: \"kubernetes.io/projected/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-kube-api-access-qx9dg\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.441066 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/de938bf1-1696-46c9-b6af-9a3766846e8d-node-bootstrap-token\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.460265 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd030212-5b03-4555-b885-388260b53588-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.468172 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.468367 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.968333214 +0000 UTC m=+257.543078054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.469400 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.469861 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.969850324 +0000 UTC m=+257.544595244 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.479125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55g67\" (UniqueName: \"kubernetes.io/projected/760c21cc-aac0-45ad-9d41-94ff93b92c44-kube-api-access-55g67\") pod \"package-server-manager-789f6589d5-vgscx\" (UID: \"760c21cc-aac0-45ad-9d41-94ff93b92c44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.490308 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" event={"ID":"8ed2be9e-a493-4ce8-aee1-83f3ae258fba","Type":"ContainerStarted","Data":"384012e39609968e40f00b85401f9373f0fcd56486e57e57ec0ce3bc4aaa8163"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.491590 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" event={"ID":"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637","Type":"ContainerStarted","Data":"d169670b40d53da31764b4cc4e5df28c3ae9d827ab8db481e9ff9be957e4d6a9"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.493392 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pgsqh" event={"ID":"5f875d21-ddf2-4d41-8be3-819c8836824a","Type":"ContainerStarted","Data":"0526269f3752c495953fd88d5da903a92103220f8039ec4c7dde34390b5f6401"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.494917 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" event={"ID":"f06790e0-cf8c-48f0-8d48-893663fdbd1c","Type":"ContainerStarted","Data":"399d10bfc0ba0e60f2a6e2b19d51a033e4c3420b3329b501676c4889b0d1fe5e"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.494945 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" event={"ID":"f06790e0-cf8c-48f0-8d48-893663fdbd1c","Type":"ContainerStarted","Data":"afb06ad3e0bcd75590f17c823bc1aff440fe73faa5a7fd84f4912126237879bd"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.495868 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nnfvg" event={"ID":"9b31b04d-28d1-4397-88b3-b26a4bb6ede9","Type":"ContainerStarted","Data":"953c1bb92c70403222e509c171da1d740d9321c8dc5b21025da97debd2f8ca77"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.497665 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" event={"ID":"35cf99cc-0bae-4b8d-b861-103e3174f081","Type":"ContainerStarted","Data":"0f3e2cd60735c7ed34516ddbf5173175e86cd453f29e51936da898bfa27f9a01"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.499542 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" event={"ID":"d0167d9e-5565-4154-80bb-3856d9b5985f","Type":"ContainerStarted","Data":"248e2d30296bd1e1d92d5277ca75c197cf7655345299c525ecce91477b057af8"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.500563 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqc9n\" (UniqueName: \"kubernetes.io/projected/577d365f-ec95-4de4-a6a4-6752b2f0de56-kube-api-access-pqc9n\") pod \"cluster-samples-operator-665b6dd947-txb7s\" (UID: \"577d365f-ec95-4de4-a6a4-6752b2f0de56\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.511005 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.522016 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z725k\" (UniqueName: \"kubernetes.io/projected/bead92a8-42de-4171-9c0c-790d64a6d14a-kube-api-access-z725k\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.541746 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.558185 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.559994 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.570646 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.570976 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsjnk\" (UniqueName: \"kubernetes.io/projected/de938bf1-1696-46c9-b6af-9a3766846e8d-kube-api-access-tsjnk\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.571093 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.071071057 +0000 UTC m=+257.645815987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.578315 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm74g\" (UniqueName: \"kubernetes.io/projected/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-kube-api-access-tm74g\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.580056 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.599646 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b84z\" (UniqueName: \"kubernetes.io/projected/533b6e54-efa5-4032-bebd-eedc39a834b8-kube-api-access-8b84z\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.622142 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2gwc\" (UniqueName: \"kubernetes.io/projected/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-kube-api-access-l2gwc\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.626635 
4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk4m9\" (UniqueName: \"kubernetes.io/projected/75e6cce3-f879-4c6b-8ef3-8d2a4feecae1-kube-api-access-fk4m9\") pod \"dns-operator-744455d44c-mlr7l\" (UID: \"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l"
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.660227 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g"
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.665951 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2"
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.672867 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.673824 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.173812109 +0000 UTC m=+257.748556949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.681720 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n"
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.687359 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh"
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.734963 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2"
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.774508 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.774724 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.274685391 +0000 UTC m=+257.849430231 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.774917 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.777813 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.277790475 +0000 UTC m=+257.852535315 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.832475 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l"
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.852532 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wgbcp"
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.870933 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qqzxx"
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.876644 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.877005 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.376983303 +0000 UTC m=+257.951728133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.877130 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.877481 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.377474076 +0000 UTC m=+257.952218916 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.978152 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.978493 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.478444881 +0000 UTC m=+258.053189721 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.978975 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.979370 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.479349895 +0000 UTC m=+258.054094735 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.090917 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.091388 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.591358118 +0000 UTC m=+258.166102958 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.091627 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.091947 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.591939624 +0000 UTC m=+258.166684464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.152946 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q7qs8"]
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.161396 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb"]
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.178064 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gkpf4"]
Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.193766 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.693734881 +0000 UTC m=+258.268479721 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.192183 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.196720 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.197048 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.69703583 +0000 UTC m=+258.271780670 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.213414 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6frtc"]
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.301631 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.303030 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.80300753 +0000 UTC m=+258.377752370 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.354812 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ckp9s"]
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.373279 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"]
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.404585 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.405002 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.904989882 +0000 UTC m=+258.479734722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.510211 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.510564 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.010546781 +0000 UTC m=+258.585291621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.571092 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" event={"ID":"6cdf835c-58e4-4297-a247-690f407af22d","Type":"ContainerStarted","Data":"ac90e6e39e7a5f446606c85ba56fd804f2a5f81ecff1fa3d0efe068009b15849"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.582479 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" event={"ID":"8ed2be9e-a493-4ce8-aee1-83f3ae258fba","Type":"ContainerStarted","Data":"e4732188769d810ad5cc0b5c077b4adea2c1620a874dd07a965859f325d7cd4e"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.588587 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" event={"ID":"97ee6937-a1a5-42ea-a460-29d54478e633","Type":"ContainerStarted","Data":"ea27939aa8b795cbb05c9ef86fd0c0cedd8519701e620ecc91f86b4b95a08fc2"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.598272 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" event={"ID":"0a99ad6c-7819-4b33-8846-26e6ede5ce22","Type":"ContainerStarted","Data":"e27e08ba37ccea769f54c163cfe33287adcf772ae3e9703fa688112dc81941d0"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.607214 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" event={"ID":"f06790e0-cf8c-48f0-8d48-893663fdbd1c","Type":"ContainerStarted","Data":"83ed3cd2c88577c4a8ab7718302c6c52e3fe63ca877b71c85b312c9ae9680692"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.615437 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.615929 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.115911915 +0000 UTC m=+258.690656755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.619264 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" event={"ID":"80d2d01c-2b8d-49ff-adad-6b49568293a0","Type":"ContainerStarted","Data":"11d027c4ff144e592c2325bf8ed82a69537026248fc53f326a958432c477becd"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.625647 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nnfvg" event={"ID":"9b31b04d-28d1-4397-88b3-b26a4bb6ede9","Type":"ContainerStarted","Data":"d7ad50204a6e2cd1c8da5c24755ef358bd43e98cde4c732ef742b2d21292299a"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.637714 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" event={"ID":"034ec244-f99c-4c50-a55a-9b33b8b376c3","Type":"ContainerStarted","Data":"47bde142c5dd33c66938dfe740b25e3625d59cc1c96c745b6bc2929fed2803f2"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.637773 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" event={"ID":"034ec244-f99c-4c50-a55a-9b33b8b376c3","Type":"ContainerStarted","Data":"a39d0575f5d97e39ab3f8a698d70640fed87fbd9c05fe240af2a6f1ce398a1fc"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.640477 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pgsqh" event={"ID":"5f875d21-ddf2-4d41-8be3-819c8836824a","Type":"ContainerStarted","Data":"f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.644620 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qqzxx" event={"ID":"de938bf1-1696-46c9-b6af-9a3766846e8d","Type":"ContainerStarted","Data":"a00499b5b2a5d6bcc09c5e31a017b82e8c9cc19431c4f72d82b3e51c6f842948"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.644689 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qqzxx" event={"ID":"de938bf1-1696-46c9-b6af-9a3766846e8d","Type":"ContainerStarted","Data":"bf99e3d12d16acdd22819c226785c08c6b53925cef7c467eb21d02d826f550b2"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.647265 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" event={"ID":"6e93d5ac-22fb-4d53-86c4-3262993f2116","Type":"ContainerStarted","Data":"6c27a915fd14451a692cf576af43979df23ddbd5e61d6a727e34a13eea5af811"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.655967 4778 generic.go:334] "Generic (PLEG): container finished" podID="35cf99cc-0bae-4b8d-b861-103e3174f081" containerID="86393bf25c39867325647f99ee03a38d004b1fe8ea252a33b08e0f4463a8615e" exitCode=0
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.656787 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" event={"ID":"35cf99cc-0bae-4b8d-b861-103e3174f081","Type":"ContainerDied","Data":"86393bf25c39867325647f99ee03a38d004b1fe8ea252a33b08e0f4463a8615e"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.658758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" event={"ID":"918ba01d-c786-4f9a-ae58-5bcc23684c16","Type":"ContainerStarted","Data":"1c73c96ec7be59ec8398b94c1b9eef7bba25efc85f846fdcd30027fb38409c08"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.659335 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-q7qs8"
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.660947 4778 patch_prober.go:28] interesting pod/console-operator-58897d9998-q7qs8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.661466 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" podUID="918ba01d-c786-4f9a-ae58-5bcc23684c16" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused"
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.669248 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" event={"ID":"a2907797-7fb3-44c0-81cf-783512fd1bf6","Type":"ContainerStarted","Data":"afbb41bf570d58818b80b3649704135d217cda0a15059129ddd38187b552fed1"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.669286 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" event={"ID":"a2907797-7fb3-44c0-81cf-783512fd1bf6","Type":"ContainerStarted","Data":"034d3d04512c50babfedd8291845f3b40baf1bac18820ecad81e04f9283eddf2"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.693184 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" event={"ID":"d0167d9e-5565-4154-80bb-3856d9b5985f","Type":"ContainerStarted","Data":"c54ebb09077fd76f31682fa7bdc8047b8510156410fb4d70d68ac82f9573d624"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.698379 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" event={"ID":"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637","Type":"ContainerStarted","Data":"e9257ea8202297fa6d8a75a0ad4f78bf4d5ee04b58490bb83b63980f5daf0cc8"}
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.698989 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.700182 4778 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-dsqlz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.700248 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" podUID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.717265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.717546 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.217526677 +0000 UTC m=+258.792271517 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.717990 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.719452 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.219431648 +0000 UTC m=+258.794176498 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.755940 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs"]
Mar 18 09:06:31 crc kubenswrapper[4778]: W0318 09:06:31.782792 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef74c17c_eb2a_4bef_b948_6b06efd76719.slice/crio-1f29d37f1112fc081e73de16be0afe9b8e06f130904067805bb112ee4fe30e1c WatchSource:0}: Error finding container 1f29d37f1112fc081e73de16be0afe9b8e06f130904067805bb112ee4fe30e1c: Status 404 returned error can't find the container with id 1f29d37f1112fc081e73de16be0afe9b8e06f130904067805bb112ee4fe30e1c
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.823316 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx"]
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.823821 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.824187 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.324165044 +0000 UTC m=+258.898909884 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.850780 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvvlz"]
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.883069 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tnw27"]
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.908312 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563744-btdt7"]
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.929419 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.931213 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.431181142 +0000 UTC m=+259.005925982 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.959956 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5dpv"]
Mar 18 09:06:31 crc kubenswrapper[4778]: W0318 09:06:31.966387 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08b964cf_bfc5_4b90_83a3_0b358c3ffbc9.slice/crio-375a8c353b029593bb91ef23f773207b5d24c55a6f2c8d8b5b46e3882a7bf7d3 WatchSource:0}: Error finding container 375a8c353b029593bb91ef23f773207b5d24c55a6f2c8d8b5b46e3882a7bf7d3: Status 404 returned error can't find the container with id 375a8c353b029593bb91ef23f773207b5d24c55a6f2c8d8b5b46e3882a7bf7d3
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.984305 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-57msj"]
Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.994913 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52"]
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.006286 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563746-b66f7"]
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.030096 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.030578 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.530559375 +0000 UTC m=+259.105304215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.032367 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4"]
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.036260 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.052115 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w2lf2"]
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.063598 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4"]
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.095819 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openshift-marketplace/marketplace-operator-79b997595-2hr48"] Mar 18 09:06:32 crc kubenswrapper[4778]: W0318 09:06:32.120913 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba84f396_0169_4d5e_a126_60ac9d6d49f8.slice/crio-e9cf4ed56d51f329fa6bbc3fc0a2995ce8217e5643b5fbead6c9478307157c13 WatchSource:0}: Error finding container e9cf4ed56d51f329fa6bbc3fc0a2995ce8217e5643b5fbead6c9478307157c13: Status 404 returned error can't find the container with id e9cf4ed56d51f329fa6bbc3fc0a2995ce8217e5643b5fbead6c9478307157c13 Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.133575 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.136671 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.140370 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.640352588 +0000 UTC m=+259.215097428 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.146682 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.146764 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.158016 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.175791 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jmmm2"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.177182 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.213620 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-pgsqh" podStartSLOduration=188.213585324 podStartE2EDuration="3m8.213585324s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.187444599 +0000 UTC m=+258.762189449" watchObservedRunningTime="2026-03-18 09:06:32.213585324 
+0000 UTC m=+258.788330164" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.219165 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" podStartSLOduration=188.219148334 podStartE2EDuration="3m8.219148334s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.215103134 +0000 UTC m=+258.789847974" watchObservedRunningTime="2026-03-18 09:06:32.219148334 +0000 UTC m=+258.793893174" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.238187 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.240362 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.740331755 +0000 UTC m=+259.315076595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.243110 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.243446 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.74343168 +0000 UTC m=+259.318176510 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.247562 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.247675 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.247789 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5gdpq"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.247853 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.247931 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.247998 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.254343 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mlr7l"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.254553 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.261147 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qqzxx" podStartSLOduration=5.261120467 podStartE2EDuration="5.261120467s" podCreationTimestamp="2026-03-18 09:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.25234496 +0000 UTC m=+258.827089800" watchObservedRunningTime="2026-03-18 09:06:32.261120467 +0000 UTC m=+258.835865317" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.267879 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:32 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:32 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:32 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.267963 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.276318 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgbcp"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.298909 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" podStartSLOduration=188.298886355 podStartE2EDuration="3m8.298886355s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.297703794 +0000 UTC m=+258.872448634" watchObservedRunningTime="2026-03-18 09:06:32.298886355 +0000 UTC m=+258.873631195" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.344777 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.345177 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.845156755 +0000 UTC m=+259.419901585 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: W0318 09:06:32.371736 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75e6cce3_f879_4c6b_8ef3_8d2a4feecae1.slice/crio-5f8e6fb57b55e014fff4bf0ebda0361d7586a0fec3186d8e1e4e58a550358063 WatchSource:0}: Error finding container 5f8e6fb57b55e014fff4bf0ebda0361d7586a0fec3186d8e1e4e58a550358063: Status 404 returned error can't find the container with id 5f8e6fb57b55e014fff4bf0ebda0361d7586a0fec3186d8e1e4e58a550358063 Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.379686 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" podStartSLOduration=187.379667155 podStartE2EDuration="3m7.379667155s" podCreationTimestamp="2026-03-18 09:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.379299086 +0000 UTC m=+258.954043936" watchObservedRunningTime="2026-03-18 09:06:32.379667155 +0000 UTC m=+258.954411995" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.381006 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-nnfvg" podStartSLOduration=188.381000832 podStartE2EDuration="3m8.381000832s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.338176016 +0000 UTC m=+258.912920866" watchObservedRunningTime="2026-03-18 09:06:32.381000832 +0000 UTC m=+258.955745672" Mar 18 09:06:32 crc kubenswrapper[4778]: W0318 09:06:32.389578 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e4e3b9_bd77_47d8_98d7_f79849a3fc4a.slice/crio-27677d79c821d5cb3c27fcc6e8291c62636ee81cb359205c816e39f5589c41d5 WatchSource:0}: Error finding container 27677d79c821d5cb3c27fcc6e8291c62636ee81cb359205c816e39f5589c41d5: Status 404 returned error can't find the container with id 27677d79c821d5cb3c27fcc6e8291c62636ee81cb359205c816e39f5589c41d5 Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.446854 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.447475 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.947462525 +0000 UTC m=+259.522207365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.456091 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" podStartSLOduration=188.456073577 podStartE2EDuration="3m8.456073577s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.42170786 +0000 UTC m=+258.996452730" watchObservedRunningTime="2026-03-18 09:06:32.456073577 +0000 UTC m=+259.030818417" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.456873 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" podStartSLOduration=188.456869509 podStartE2EDuration="3m8.456869509s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.455829351 +0000 UTC m=+259.030574211" watchObservedRunningTime="2026-03-18 09:06:32.456869509 +0000 UTC m=+259.031614349" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.540812 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" podStartSLOduration=188.540793194 podStartE2EDuration="3m8.540793194s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.536328353 +0000 UTC m=+259.111073203" watchObservedRunningTime="2026-03-18 09:06:32.540793194 +0000 UTC m=+259.115538034" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.547663 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.548164 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.048143242 +0000 UTC m=+259.622888082 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.650491 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.651357 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.151308166 +0000 UTC m=+259.726053026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.708706 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" event={"ID":"97ee6937-a1a5-42ea-a460-29d54478e633","Type":"ContainerStarted","Data":"f1aaa8a2c1f96baaee4b7353f353a9b567635ea9eb73df19ffa50153f00a757d"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.728264 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" event={"ID":"c0f3c490-ee49-4a88-893e-132592dd6d59","Type":"ContainerStarted","Data":"e15756b13b34411edcc3f2b0d1a8832dacf6a1d93648315a7091e044b914508a"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.732945 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" podStartSLOduration=188.732931979 podStartE2EDuration="3m8.732931979s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.730497973 +0000 UTC m=+259.305242823" watchObservedRunningTime="2026-03-18 09:06:32.732931979 +0000 UTC m=+259.307676819" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.741640 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" 
event={"ID":"e24d15f2-56e5-4fcc-91ab-370d7b4fb41e","Type":"ContainerStarted","Data":"d956f9a5d6b254e8d728e6b23e7ef10dfcb87defb8a9a961c3869358ea36fed7"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.752569 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.752784 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.252744984 +0000 UTC m=+259.827489824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.753415 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.755385 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.255367324 +0000 UTC m=+259.830112154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.790906 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" event={"ID":"6d6ab3a6-da16-4fc8-9235-2c223661de30","Type":"ContainerStarted","Data":"6f9d22309b6e8be92e6bf6a2a403f7867a0fd457314b02c7511d1124370512b5"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.790972 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" event={"ID":"6d6ab3a6-da16-4fc8-9235-2c223661de30","Type":"ContainerStarted","Data":"4f106af8b6fde6e29b85df853162f89730389f4a7ad45e58e6bc697664ae04b5"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.822922 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.824032 4778 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5vjr4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body= Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 
09:06:32.824092 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" podUID="6d6ab3a6-da16-4fc8-9235-2c223661de30" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.831416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" event={"ID":"650a32b4-d961-4805-8521-f1f24de6ad4a","Type":"ContainerStarted","Data":"6ba048943d6cee093680f080207cfd2aadce6c1b8841ea5a02e9fd700cadc182"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.852031 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" podStartSLOduration=188.852015203 podStartE2EDuration="3m8.852015203s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.851111598 +0000 UTC m=+259.425856448" watchObservedRunningTime="2026-03-18 09:06:32.852015203 +0000 UTC m=+259.426760043" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.854253 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" event={"ID":"496a64ab-b670-4201-9238-d60415ccba17","Type":"ContainerStarted","Data":"f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.854324 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" event={"ID":"496a64ab-b670-4201-9238-d60415ccba17","Type":"ContainerStarted","Data":"eca604f85fcfe59c73e6a0d9a12120a2d10108fa64f5bc3107b7c718f96ed398"} Mar 18 09:06:32 
crc kubenswrapper[4778]: I0318 09:06:32.854893 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.855157 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.855285 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.355266211 +0000 UTC m=+259.930011051 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.855908 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.857075 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.35706186 +0000 UTC m=+259.931806700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.859797 4778 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hvvlz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.859842 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" podUID="496a64ab-b670-4201-9238-d60415ccba17" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.860717 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" event={"ID":"d4d183f7-2762-458d-83f1-a8894c00bb82","Type":"ContainerStarted","Data":"2c72c2692ffbed3ea96bf7a41ea070b572a03640ab6550fc6a2c2b0a29b92e70"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.866211 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" 
event={"ID":"918ba01d-c786-4f9a-ae58-5bcc23684c16","Type":"ContainerStarted","Data":"2850f8d396e9ed13b8edb495152385907e96097a7180aa00b5627e55e496bedf"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.874831 4778 patch_prober.go:28] interesting pod/console-operator-58897d9998-q7qs8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.875275 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" podUID="918ba01d-c786-4f9a-ae58-5bcc23684c16" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.877772 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" podStartSLOduration=188.877751918 podStartE2EDuration="3m8.877751918s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.87524902 +0000 UTC m=+259.449993870" watchObservedRunningTime="2026-03-18 09:06:32.877751918 +0000 UTC m=+259.452496758" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.887084 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgbcp" event={"ID":"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a","Type":"ContainerStarted","Data":"27677d79c821d5cb3c27fcc6e8291c62636ee81cb359205c816e39f5589c41d5"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.915385 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" 
event={"ID":"80d2d01c-2b8d-49ff-adad-6b49568293a0","Type":"ContainerStarted","Data":"8405f133084eb28be0cb40e1d5cc155d1b1a741b67ea35d7d14894293fadf775"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.915435 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" event={"ID":"80d2d01c-2b8d-49ff-adad-6b49568293a0","Type":"ContainerStarted","Data":"72a70033df350c0b2a2f582a79ba90cb64bdc98a6c08453c09e0ca225d0b9999"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.929396 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" event={"ID":"533b6e54-efa5-4032-bebd-eedc39a834b8","Type":"ContainerStarted","Data":"5e4f46b76ad29fc18d28d273fdddf8d428764dfc6d0c15265d6d306626152cee"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.944528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563746-b66f7" event={"ID":"c3be356e-94af-47db-a182-dd8a57024619","Type":"ContainerStarted","Data":"4d957b42c20ebb120c0681574b93e0b852f5977f6c96c78d95883a927b1e8844"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.959314 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.960662 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.460642345 +0000 UTC m=+260.035387185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.966758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563744-btdt7" event={"ID":"54961f10-93b0-433f-8a7d-b30d69178e9a","Type":"ContainerStarted","Data":"df77c4671fb6dc8dc3716ac3d7733190f2f2696ab30319657174e00cec76ec77"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.999844 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" event={"ID":"6c0fd619-d1c2-45e7-a7cf-e784b082428f","Type":"ContainerStarted","Data":"3c28da9a3ac748f9028b1eea9e0023b277c9fe45e8f3ddff6c1db4a612169392"} Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.999902 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" event={"ID":"6c0fd619-d1c2-45e7-a7cf-e784b082428f","Type":"ContainerStarted","Data":"1e993fccca6419958be3b799f77590edcfea874d835f2a499222ed4fbcb22114"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.002441 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" event={"ID":"9c6e16fb-c90d-4e0d-a57e-90a778a52f97","Type":"ContainerStarted","Data":"cb5c5b7566c5f874889b71b840058c068af4116eeb732133e641ee9b9473101b"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.004256 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" event={"ID":"ef74c17c-eb2a-4bef-b948-6b06efd76719","Type":"ContainerStarted","Data":"b06a880707bff57bc2b5d0e7f2a62ea0d46656d370ddbaf3e6424f3e8c4c804a"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.004322 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" event={"ID":"ef74c17c-eb2a-4bef-b948-6b06efd76719","Type":"ContainerStarted","Data":"1f29d37f1112fc081e73de16be0afe9b8e06f130904067805bb112ee4fe30e1c"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.008169 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" event={"ID":"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1","Type":"ContainerStarted","Data":"5f8e6fb57b55e014fff4bf0ebda0361d7586a0fec3186d8e1e4e58a550358063"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.028265 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" podStartSLOduration=189.028247089 podStartE2EDuration="3m9.028247089s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.944600302 +0000 UTC m=+259.519345162" watchObservedRunningTime="2026-03-18 09:06:33.028247089 +0000 UTC m=+259.602991929" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.028435 4778 generic.go:334] "Generic (PLEG): container finished" podID="6e93d5ac-22fb-4d53-86c4-3262993f2116" containerID="6062ede6f155532278065abaf189fa02e386986f37b7ff7b5d5da49d03d768cd" exitCode=0 Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.029480 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" podStartSLOduration=189.029472012 podStartE2EDuration="3m9.029472012s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.026304527 +0000 UTC m=+259.601049367" watchObservedRunningTime="2026-03-18 09:06:33.029472012 +0000 UTC m=+259.604216862" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.029681 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" event={"ID":"6e93d5ac-22fb-4d53-86c4-3262993f2116","Type":"ContainerDied","Data":"6062ede6f155532278065abaf189fa02e386986f37b7ff7b5d5da49d03d768cd"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.044632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" event={"ID":"35cf99cc-0bae-4b8d-b861-103e3174f081","Type":"ContainerStarted","Data":"a5ca2616e67739d3bb0e01c9dac1e0fb59930ed790b9bd6756bfce05da151ac6"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.046834 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jmmm2" event={"ID":"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d","Type":"ContainerStarted","Data":"89b13df349cffa3c715e5e9ed7a7c27a7e88cddbd2209e8f5bc039de0402bdf6"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.049910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" event={"ID":"0a99ad6c-7819-4b33-8846-26e6ede5ce22","Type":"ContainerStarted","Data":"95ac5b77eda346ee57bd444c0e14e698dddd7a3b6ad33e441397cb407c1eb72b"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.051163 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" 
event={"ID":"ba84f396-0169-4d5e-a126-60ac9d6d49f8","Type":"ContainerStarted","Data":"e9cf4ed56d51f329fa6bbc3fc0a2995ce8217e5643b5fbead6c9478307157c13"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.060850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" event={"ID":"bead92a8-42de-4171-9c0c-790d64a6d14a","Type":"ContainerStarted","Data":"45df2c1880ea6d858e0415030f4d35fcd594de872512718e5959a7fc551f1216"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.061773 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.062503 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.562488393 +0000 UTC m=+260.137233233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.066613 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tnw27" event={"ID":"08b964cf-bfc5-4b90-83a3-0b358c3ffbc9","Type":"ContainerStarted","Data":"375a8c353b029593bb91ef23f773207b5d24c55a6f2c8d8b5b46e3882a7bf7d3"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.067231 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-tnw27" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.070983 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.071026 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.078869 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" event={"ID":"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a","Type":"ContainerStarted","Data":"44fcaa7d9066c5bc322cc3c475c2c95ffa382825c1c11ff0bdf59ba686b15693"} Mar 
18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.080109 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.082870 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" event={"ID":"760c21cc-aac0-45ad-9d41-94ff93b92c44","Type":"ContainerStarted","Data":"4e2c9a289270adf41b0f5b354960fc59b8ab509feb18883435be1d79f378229a"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.082923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" event={"ID":"760c21cc-aac0-45ad-9d41-94ff93b92c44","Type":"ContainerStarted","Data":"c032f33bdab4c47b70a3c3fb83a78557194d3681d2fe59a6004d2e5e8bf1bbc5"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.087793 4778 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2hr48 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.087862 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.097291 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" event={"ID":"db81860d-bcb7-4a56-a935-544dbc4be29b","Type":"ContainerStarted","Data":"233e58e62c8d40d87963329725284bd0d629e6646b095fe46a9712b711f0c101"} Mar 18 09:06:33 
crc kubenswrapper[4778]: I0318 09:06:33.104980 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" event={"ID":"dd030212-5b03-4555-b885-388260b53588","Type":"ContainerStarted","Data":"9db719568e55e34bbbcd8687188ce7d4fc887351f3b43cbdeabef0886d1784d1"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.108357 4778 generic.go:334] "Generic (PLEG): container finished" podID="6cdf835c-58e4-4297-a247-690f407af22d" containerID="460b0e8d587363cdd40956bfc336c4113dd9579cf07f3fe9c6ec33feb5cbe8d0" exitCode=0 Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.108415 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" event={"ID":"6cdf835c-58e4-4297-a247-690f407af22d","Type":"ContainerDied","Data":"460b0e8d587363cdd40956bfc336c4113dd9579cf07f3fe9c6ec33feb5cbe8d0"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.117168 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" podStartSLOduration=188.117153328 podStartE2EDuration="3m8.117153328s" podCreationTimestamp="2026-03-18 09:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.116576153 +0000 UTC m=+259.691320993" watchObservedRunningTime="2026-03-18 09:06:33.117153328 +0000 UTC m=+259.691898168" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.142058 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" podStartSLOduration=188.1420402 podStartE2EDuration="3m8.1420402s" podCreationTimestamp="2026-03-18 09:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
09:06:33.141909146 +0000 UTC m=+259.716654006" watchObservedRunningTime="2026-03-18 09:06:33.1420402 +0000 UTC m=+259.716785040" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.149774 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" event={"ID":"46167450-7100-4ac9-a9dd-e678eb3d8677","Type":"ContainerStarted","Data":"c5603b8c9cf3c8d4db15b72d831907b4f4e8bf169cb9a4b076037f037d6f770e"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.160739 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" event={"ID":"c851677f-703c-404c-801c-064cc6bf3979","Type":"ContainerStarted","Data":"a98098a4667e740b4622df5432b1ded5001b5d2b904ba90ebd391d3be8a341e8"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.163376 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.166309 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.666262824 +0000 UTC m=+260.241007674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.182248 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" event={"ID":"8ed2be9e-a493-4ce8-aee1-83f3ae258fba","Type":"ContainerStarted","Data":"3e5630c30cd6b602b691ad31aec81b3cbbbd0649c5615b1bdc162de71319ba73"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.224277 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-tnw27" podStartSLOduration=189.224259519 podStartE2EDuration="3m9.224259519s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.192904172 +0000 UTC m=+259.767649022" watchObservedRunningTime="2026-03-18 09:06:33.224259519 +0000 UTC m=+259.799004359" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.225477 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" podStartSLOduration=189.225470121 podStartE2EDuration="3m9.225470121s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.223250521 +0000 UTC m=+259.797995381" watchObservedRunningTime="2026-03-18 09:06:33.225470121 +0000 UTC m=+259.800214961" Mar 18 09:06:33 crc 
kubenswrapper[4778]: I0318 09:06:33.225753 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" event={"ID":"0b244100-6e88-4ba2-b656-83b6e31d23c8","Type":"ContainerStarted","Data":"5cdf4d5e0704cfdc77fd7807adc7324c637483d9eaea5640cc19ba4e5e95b39b"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.248622 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" podStartSLOduration=189.248603375 podStartE2EDuration="3m9.248603375s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.248142443 +0000 UTC m=+259.822887293" watchObservedRunningTime="2026-03-18 09:06:33.248603375 +0000 UTC m=+259.823348215" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.258664 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:33 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:33 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:33 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.258715 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.259941 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" 
event={"ID":"577d365f-ec95-4de4-a6a4-6752b2f0de56","Type":"ContainerStarted","Data":"657cdd1a7127ceffd000c9baa751dc82fbbc67cf208f9eae19b01b841a3c1ce2"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.264688 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" event={"ID":"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c","Type":"ContainerStarted","Data":"0c85e5526ff3d3aad837f9a0130f64206266064ce5fbb75b5de406069d316289"} Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.268149 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.271018 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.77099826 +0000 UTC m=+260.345743100 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.276682 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.303491 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" podStartSLOduration=189.303439516 podStartE2EDuration="3m9.303439516s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.29728644 +0000 UTC m=+259.872031280" watchObservedRunningTime="2026-03-18 09:06:33.303439516 +0000 UTC m=+259.878184356" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.369495 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.370912 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:33.870864145 +0000 UTC m=+260.445608985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.471367 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.472147 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.972120048 +0000 UTC m=+260.546864888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.572723 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.573526 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.073501073 +0000 UTC m=+260.648245923 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.574977 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.575611 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.0755983 +0000 UTC m=+260.650343140 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.682873 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.683404 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.18338436 +0000 UTC m=+260.758129200 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.784407 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.785476 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.285459104 +0000 UTC m=+260.860203954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.892025 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51694: no serving certificate available for the kubelet"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.892874 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.893046 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.393014336 +0000 UTC m=+260.967759176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.893221 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.893537 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.393525091 +0000 UTC m=+260.968269931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.935393 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51700: no serving certificate available for the kubelet"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.988467 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51714: no serving certificate available for the kubelet"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.994177 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.994562 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.494542007 +0000 UTC m=+261.069286847 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.095976 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.096440 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.596420146 +0000 UTC m=+261.171165006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.098635 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51724: no serving certificate available for the kubelet"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.196823 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.197389 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.697147495 +0000 UTC m=+261.271892335 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.198387 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.199402 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.699390815 +0000 UTC m=+261.274135655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.208725 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51732: no serving certificate available for the kubelet"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.263174 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 09:06:34 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Mar 18 09:06:34 crc kubenswrapper[4778]: [+]process-running ok
Mar 18 09:06:34 crc kubenswrapper[4778]: healthz check failed
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.263244 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.300917 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.301650 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.801626384 +0000 UTC m=+261.376371224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.353513 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51746: no serving certificate available for the kubelet"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.368370 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" event={"ID":"ba84f396-0169-4d5e-a126-60ac9d6d49f8","Type":"ContainerStarted","Data":"f48719cbac1747bc224535092fc3d4ad7429d42e033cd378c6065bf7d1519bf0"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.403861 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.405027 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.905014514 +0000 UTC m=+261.479759354 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.426792 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" event={"ID":"6c0fd619-d1c2-45e7-a7cf-e784b082428f","Type":"ContainerStarted","Data":"e2a2e8f9c8a6f8bc53374676c3f047ded50753ac8397e7b209e932507b5d89ff"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.451034 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" event={"ID":"dd030212-5b03-4555-b885-388260b53588","Type":"ContainerStarted","Data":"ab5434042669231d2fdb14802eec8a0590443d2bd3a9e65c7809088a2d527275"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.458676 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51754: no serving certificate available for the kubelet"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.491510 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" event={"ID":"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a","Type":"ContainerStarted","Data":"90afc0123cf376859383c4f842d0587910af2413023c90f78391ff8bd752a9d6"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.492684 4778 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2hr48 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.492724 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.506877 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.512634 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.012611008 +0000 UTC m=+261.587355848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.540903 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" event={"ID":"760c21cc-aac0-45ad-9d41-94ff93b92c44","Type":"ContainerStarted","Data":"f2e77ec04f3d30cbfdb3b3ad7a2a1c69dced720b5b7ef0ff2e17a63a024918aa"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.541266 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.573960 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jmmm2" event={"ID":"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d","Type":"ContainerStarted","Data":"74c70a3ef48017cd5b04af0e4eaab181c5cbc27e8548f84ceffd37b86c746659"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.581299 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" event={"ID":"46167450-7100-4ac9-a9dd-e678eb3d8677","Type":"ContainerStarted","Data":"a3257953897e0bdf8b7d813090d9374915df516a5d6d321cb7c31e54733e6c34"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.609024 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" event={"ID":"9c6e16fb-c90d-4e0d-a57e-90a778a52f97","Type":"ContainerStarted","Data":"744bb67aaefa8b621cba0537144fe11c395b132cc4169af137251761fd3444de"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.610345 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.614990 4778 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2kmb2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.615043 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" podUID="9c6e16fb-c90d-4e0d-a57e-90a778a52f97" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.615445 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.615694 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.615708 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.617702 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.117689134 +0000 UTC m=+261.692433974 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.629337 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" event={"ID":"6e93d5ac-22fb-4d53-86c4-3262993f2116","Type":"ContainerStarted","Data":"81d08bac258f86cd207d727ee8984390039e2d5e7db40deb32f16fc2e0383e62"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.645146 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" event={"ID":"6cdf835c-58e4-4297-a247-690f407af22d","Type":"ContainerStarted","Data":"d6435614874e1ed1701b0989ff36d1da587619d778214ac17a7fda1ce1a089d0"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.645250 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.648251 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51762: no serving certificate available for the kubelet"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.675293 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" event={"ID":"e24d15f2-56e5-4fcc-91ab-370d7b4fb41e","Type":"ContainerStarted","Data":"f149547c282b14060ac7f4f965632cabec14a5eedeb495fbdb314b9a9c453412"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.675356 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" event={"ID":"e24d15f2-56e5-4fcc-91ab-370d7b4fb41e","Type":"ContainerStarted","Data":"76f95dceb354314ec5566a2edabbce64ad7d245b0d1999d6af744d80ece10efc"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.701289 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" event={"ID":"d4d183f7-2762-458d-83f1-a8894c00bb82","Type":"ContainerStarted","Data":"d8e3b8f1eeb4e7cc07f6bcdb3906e975c0dd5306fa698d44586fa07838ab827b"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.702320 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.705494 4778 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8xcgz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body=
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.705546 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" podUID="d4d183f7-2762-458d-83f1-a8894c00bb82" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.716970 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.717530 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.217493547 +0000 UTC m=+261.792238387 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.728445 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" event={"ID":"0b244100-6e88-4ba2-b656-83b6e31d23c8","Type":"ContainerStarted","Data":"4d6f482e5067ec90c4ea0074fb58358982b91995483dd47289604a192d768c4e"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.734047 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgbcp" event={"ID":"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a","Type":"ContainerStarted","Data":"9085a1b00021acac631526ffa76ad8618e99f3800c908b6b2383d08dc568f119"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.734462 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wgbcp"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.758910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" event={"ID":"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c","Type":"ContainerStarted","Data":"8e1aa81dfc8b447ee04c3b269f8dcbf8b941fbb0ec6570237b3b56acc075ba55"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.758994 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" event={"ID":"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c","Type":"ContainerStarted","Data":"ac68bb4122316bd4d7fe7980d410d3c58a422fe23f763a5e088cd2b1cc0e1a91"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.770263 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" event={"ID":"db81860d-bcb7-4a56-a935-544dbc4be29b","Type":"ContainerStarted","Data":"fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.771272 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.773446 4778 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-x5dpv container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body=
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.773494 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" podUID="db81860d-bcb7-4a56-a935-544dbc4be29b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.780622 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" event={"ID":"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1","Type":"ContainerStarted","Data":"79704d94de9aa153dd57524e4d810f280451536dd38cc2b552fcc48a1f9872f6"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.785101 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" event={"ID":"577d365f-ec95-4de4-a6a4-6752b2f0de56","Type":"ContainerStarted","Data":"0176028c7410d8701508c5ab47491d104223de9feb14dbb0bacf2f787d088492"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.785131 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" event={"ID":"577d365f-ec95-4de4-a6a4-6752b2f0de56","Type":"ContainerStarted","Data":"29451a09a06a417a40c1535b5d09b9a0a017c7f54f13ec834c02a89e57f76aa2"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.795406 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" event={"ID":"c851677f-703c-404c-801c-064cc6bf3979","Type":"ContainerStarted","Data":"c2a4c69de94a95a7dbd5dc126644dda94b3a295c364349e9470f5db1da3a11af"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.795484 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" event={"ID":"c851677f-703c-404c-801c-064cc6bf3979","Type":"ContainerStarted","Data":"3500daf72b80b03301eaa0dcbba09ffb87b9e843a02d241b91b303efcf884998"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.814050 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tnw27" event={"ID":"08b964cf-bfc5-4b90-83a3-0b358c3ffbc9","Type":"ContainerStarted","Data":"87ff1b2ebb4914fb3d5293578eb6aa3548e770b06889ecab84b124d550e03bd7"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.815384 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body=
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.815449 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.819498 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.821100 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.321064363 +0000 UTC m=+261.895809383 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.826700 4778 generic.go:334] "Generic (PLEG): container finished" podID="97ee6937-a1a5-42ea-a460-29d54478e633" containerID="f1aaa8a2c1f96baaee4b7353f353a9b567635ea9eb73df19ffa50153f00a757d" exitCode=0
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.826799 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" event={"ID":"97ee6937-a1a5-42ea-a460-29d54478e633","Type":"ContainerDied","Data":"f1aaa8a2c1f96baaee4b7353f353a9b567635ea9eb73df19ffa50153f00a757d"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.837539 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.853799 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" event={"ID":"533b6e54-efa5-4032-bebd-eedc39a834b8","Type":"ContainerStarted","Data":"de648b1451437bfd2cf21099e5ef5411d37ad2f6b7961bc2fa181e102884c69e"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.875288 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" event={"ID":"c0f3c490-ee49-4a88-893e-132592dd6d59","Type":"ContainerStarted","Data":"9e81a63a52fef7b87060ca33b437dfff7b2ec9de44df0d6374004cb744d97639"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.902310 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" event={"ID":"650a32b4-d961-4805-8521-f1f24de6ad4a","Type":"ContainerStarted","Data":"30ddffa2e12edfbd4af225fd9e9418503669f077fddea1f2d41d174cad195c40"}
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.921885 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-q7qs8"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.922007 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq"
Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.922652 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.924604 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.424575156 +0000 UTC m=+261.999319996 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.928617 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.937678 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" podStartSLOduration=189.937657918 podStartE2EDuration="3m9.937657918s" podCreationTimestamp="2026-03-18 09:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:34.929502638 +0000 UTC m=+261.504247498" watchObservedRunningTime="2026-03-18 09:06:34.937657918 +0000 UTC m=+261.512402758" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.976598 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" podStartSLOduration=190.976577659 podStartE2EDuration="3m10.976577659s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:34.970705571 +0000 UTC m=+261.545450421" watchObservedRunningTime="2026-03-18 09:06:34.976577659 +0000 UTC m=+261.551322499" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.009952 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" podStartSLOduration=191.009925459 podStartE2EDuration="3m11.009925459s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.007185755 +0000 UTC m=+261.581930595" watchObservedRunningTime="2026-03-18 09:06:35.009925459 +0000 UTC m=+261.584670299" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.025100 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.027071 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.527052181 +0000 UTC m=+262.101797021 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.106952 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" podStartSLOduration=191.106933627 podStartE2EDuration="3m11.106933627s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.055479428 +0000 UTC m=+261.630224268" watchObservedRunningTime="2026-03-18 09:06:35.106933627 +0000 UTC m=+261.681678467" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.130810 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.131356 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.631332425 +0000 UTC m=+262.206077265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.194032 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" podStartSLOduration=191.194014888 podStartE2EDuration="3m11.194014888s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.141667354 +0000 UTC m=+261.716412204" watchObservedRunningTime="2026-03-18 09:06:35.194014888 +0000 UTC m=+261.768759718" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.221954 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jmmm2" podStartSLOduration=8.221936079 podStartE2EDuration="8.221936079s" podCreationTimestamp="2026-03-18 09:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.193514114 +0000 UTC m=+261.768258954" watchObservedRunningTime="2026-03-18 09:06:35.221936079 +0000 UTC m=+261.796680919" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.224724 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" podStartSLOduration=191.224711488 podStartE2EDuration="3m11.224711488s" podCreationTimestamp="2026-03-18 09:03:24 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.22374662 +0000 UTC m=+261.798491490" watchObservedRunningTime="2026-03-18 09:06:35.224711488 +0000 UTC m=+261.799456328" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.237022 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.237543 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.737528781 +0000 UTC m=+262.312273621 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.265810 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:35 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:35 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:35 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.266219 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.268671 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" podStartSLOduration=191.268647613 podStartE2EDuration="3m11.268647613s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.26640881 +0000 UTC m=+261.841153670" watchObservedRunningTime="2026-03-18 09:06:35.268647613 +0000 UTC m=+261.843392453" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.325496 4778 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" podStartSLOduration=191.325474504 podStartE2EDuration="3m11.325474504s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.322650764 +0000 UTC m=+261.897395614" watchObservedRunningTime="2026-03-18 09:06:35.325474504 +0000 UTC m=+261.900219344" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.338660 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.339135 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.839117021 +0000 UTC m=+262.413861861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.349643 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51774: no serving certificate available for the kubelet" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.379444 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wgbcp" podStartSLOduration=8.379419842 podStartE2EDuration="8.379419842s" podCreationTimestamp="2026-03-18 09:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.364678805 +0000 UTC m=+261.939423655" watchObservedRunningTime="2026-03-18 09:06:35.379419842 +0000 UTC m=+261.954164682" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.440762 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.441160 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:35.941144422 +0000 UTC m=+262.515889262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.465626 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" podStartSLOduration=191.465594465 podStartE2EDuration="3m11.465594465s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.396540847 +0000 UTC m=+261.971285707" watchObservedRunningTime="2026-03-18 09:06:35.465594465 +0000 UTC m=+262.040339305" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.468352 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" podStartSLOduration=191.468339223 podStartE2EDuration="3m11.468339223s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.464151863 +0000 UTC m=+262.038896723" watchObservedRunningTime="2026-03-18 09:06:35.468339223 +0000 UTC m=+262.043084063" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.549066 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.549525 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.049502023 +0000 UTC m=+262.624246863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.629109 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" podStartSLOduration=191.629085658 podStartE2EDuration="3m11.629085658s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.55012759 +0000 UTC m=+262.124872450" watchObservedRunningTime="2026-03-18 09:06:35.629085658 +0000 UTC m=+262.203830498" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.650761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: 
\"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.651182 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.151162184 +0000 UTC m=+262.725907114 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.752252 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.752699 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.252678231 +0000 UTC m=+262.827423071 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.785247 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" podStartSLOduration=191.785226603 podStartE2EDuration="3m11.785226603s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.741538565 +0000 UTC m=+262.316283415" watchObservedRunningTime="2026-03-18 09:06:35.785226603 +0000 UTC m=+262.359971453" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.848958 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" podStartSLOduration=191.848937399 podStartE2EDuration="3m11.848937399s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.846344356 +0000 UTC m=+262.421089216" watchObservedRunningTime="2026-03-18 09:06:35.848937399 +0000 UTC m=+262.423682239" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.854083 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: 
\"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.854510 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.354497857 +0000 UTC m=+262.929242697 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.903040 4778 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5vjr4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.903095 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" podUID="6d6ab3a6-da16-4fc8-9235-2c223661de30" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.951271 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" 
podStartSLOduration=191.951251248 podStartE2EDuration="3m11.951251248s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.949607382 +0000 UTC m=+262.524352212" watchObservedRunningTime="2026-03-18 09:06:35.951251248 +0000 UTC m=+262.525996098" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.960484 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.960876 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.460859501 +0000 UTC m=+263.035604341 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.961740 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" event={"ID":"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1","Type":"ContainerStarted","Data":"1586c0f8095bd4b0778e1eaa506a94ee357b4e41e6cce14f56aeb6979aa14e9e"} Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.964073 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" event={"ID":"bead92a8-42de-4171-9c0c-790d64a6d14a","Type":"ContainerStarted","Data":"a73e7a24ecc52d2a54699601ef8af95aa314f9c148ba5b1bfe507fec80d5d712"} Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.994046 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgbcp" event={"ID":"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a","Type":"ContainerStarted","Data":"0f4f22aa2e3ff30f359fe0bdbfef9cbdb9bcb5f527d4e861914eba630b255cfa"} Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.013450 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" event={"ID":"6e93d5ac-22fb-4d53-86c4-3262993f2116","Type":"ContainerStarted","Data":"6b1790a943bd13c2e8ea1bfdef6a3662c2a1ab842f958c1845c6e6743882c61a"} Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.017525 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 
10.217.0.35:8080: connect: connection refused" start-of-body= Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.036138 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.052995 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.056564 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.058397 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.059573 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.062810 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.077883 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:36.577859496 +0000 UTC m=+263.152604336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.081607 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" podStartSLOduration=192.081569242 podStartE2EDuration="3m12.081569242s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:36.040612891 +0000 UTC m=+262.615357731" watchObservedRunningTime="2026-03-18 09:06:36.081569242 +0000 UTC m=+262.656314072" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.158799 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" podStartSLOduration=192.15877778 podStartE2EDuration="3m12.15877778s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:36.108474224 +0000 UTC m=+262.683219074" watchObservedRunningTime="2026-03-18 09:06:36.15877778 +0000 UTC m=+262.733522620" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.171641 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.173769 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.673742115 +0000 UTC m=+263.248486965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.235889 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.268497 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:36 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:36 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:36 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.268563 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.273767 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.274533 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.774521691 +0000 UTC m=+263.349266531 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.381543 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.382577 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.882556242 +0000 UTC m=+263.457301082 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.473453 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" podStartSLOduration=192.473434948 podStartE2EDuration="3m12.473434948s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:36.365762516 +0000 UTC m=+262.940507356" watchObservedRunningTime="2026-03-18 09:06:36.473434948 +0000 UTC m=+263.048179788" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.486706 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.487331 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:36.987320251 +0000 UTC m=+263.562065091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.491278 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" podStartSLOduration=192.491257133 podStartE2EDuration="3m12.491257133s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:36.485769457 +0000 UTC m=+263.060514307" watchObservedRunningTime="2026-03-18 09:06:36.491257133 +0000 UTC m=+263.066001973" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.494449 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvvlz"] Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.587803 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.588262 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.088243472 +0000 UTC m=+263.662988312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.601153 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"] Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.601352 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" podUID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" containerName="route-controller-manager" containerID="cri-o://e9257ea8202297fa6d8a75a0ad4f78bf4d5ee04b58490bb83b63980f5daf0cc8" gracePeriod=30 Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.672909 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.690225 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.690649 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.190632363 +0000 UTC m=+263.765377213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.774764 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51778: no serving certificate available for the kubelet" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.790734 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9n22\" (UniqueName: \"kubernetes.io/projected/97ee6937-a1a5-42ea-a460-29d54478e633-kube-api-access-h9n22\") pod \"97ee6937-a1a5-42ea-a460-29d54478e633\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.790832 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ee6937-a1a5-42ea-a460-29d54478e633-secret-volume\") pod \"97ee6937-a1a5-42ea-a460-29d54478e633\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.790900 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume\") pod \"97ee6937-a1a5-42ea-a460-29d54478e633\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.791075 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.791429 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.291411759 +0000 UTC m=+263.866156589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.792154 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume" (OuterVolumeSpecName: "config-volume") pod "97ee6937-a1a5-42ea-a460-29d54478e633" (UID: "97ee6937-a1a5-42ea-a460-29d54478e633"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.815911 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ee6937-a1a5-42ea-a460-29d54478e633-kube-api-access-h9n22" (OuterVolumeSpecName: "kube-api-access-h9n22") pod "97ee6937-a1a5-42ea-a460-29d54478e633" (UID: "97ee6937-a1a5-42ea-a460-29d54478e633"). InnerVolumeSpecName "kube-api-access-h9n22". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.817133 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ee6937-a1a5-42ea-a460-29d54478e633-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "97ee6937-a1a5-42ea-a460-29d54478e633" (UID: "97ee6937-a1a5-42ea-a460-29d54478e633"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.893311 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.893681 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.893697 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9n22\" (UniqueName: \"kubernetes.io/projected/97ee6937-a1a5-42ea-a460-29d54478e633-kube-api-access-h9n22\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.893710 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ee6937-a1a5-42ea-a460-29d54478e633-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.893984 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.393972516 +0000 UTC m=+263.968717356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.894485 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.977021 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tbbtb"] Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.977242 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ee6937-a1a5-42ea-a460-29d54478e633" containerName="collect-profiles" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.977255 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ee6937-a1a5-42ea-a460-29d54478e633" containerName="collect-profiles" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.977344 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ee6937-a1a5-42ea-a460-29d54478e633" containerName="collect-profiles" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.977950 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.992244 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.995949 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.996325 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.496304996 +0000 UTC m=+264.071049836 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.021429 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbbtb"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.027995 4778 generic.go:334] "Generic (PLEG): container finished" podID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" containerID="e9257ea8202297fa6d8a75a0ad4f78bf4d5ee04b58490bb83b63980f5daf0cc8" exitCode=0 Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.028058 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" event={"ID":"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637","Type":"ContainerDied","Data":"e9257ea8202297fa6d8a75a0ad4f78bf4d5ee04b58490bb83b63980f5daf0cc8"} Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.035691 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.037561 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" event={"ID":"97ee6937-a1a5-42ea-a460-29d54478e633","Type":"ContainerDied","Data":"ea27939aa8b795cbb05c9ef86fd0c0cedd8519701e620ecc91f86b4b95a08fc2"} Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.037609 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea27939aa8b795cbb05c9ef86fd0c0cedd8519701e620ecc91f86b4b95a08fc2" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.062897 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" podUID="496a64ab-b670-4201-9238-d60415ccba17" containerName="controller-manager" containerID="cri-o://f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0" gracePeriod=30 Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.106754 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.606735576 +0000 UTC m=+264.181480416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.109281 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.109755 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-utilities\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.110014 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6c42\" (UniqueName: \"kubernetes.io/projected/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-kube-api-access-p6c42\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.117694 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-catalog-content\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.145008 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qvn4w"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.146172 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.149460 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.166671 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qvn4w"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.220751 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221022 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-catalog-content\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221069 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-utilities\") pod 
\"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221115 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6c42\" (UniqueName: \"kubernetes.io/projected/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-kube-api-access-p6c42\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221139 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjz8f\" (UniqueName: \"kubernetes.io/projected/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-kube-api-access-vjz8f\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221158 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-catalog-content\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221177 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-utilities\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.221335 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.721316303 +0000 UTC m=+264.296061143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221710 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-catalog-content\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221915 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-utilities\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.266761 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6c42\" (UniqueName: \"kubernetes.io/projected/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-kube-api-access-p6c42\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.269342 4778 patch_prober.go:28] interesting 
pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:37 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:37 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:37 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.269396 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.290818 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lzrtd"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.291870 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.296467 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzrtd"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.308557 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.343063 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.343374 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjz8f\" (UniqueName: \"kubernetes.io/projected/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-kube-api-access-vjz8f\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.343393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-catalog-content\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.343425 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-utilities\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.343821 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-utilities\") pod \"certified-operators-qvn4w\" 
(UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.344092 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.844075552 +0000 UTC m=+264.418820392 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.355871 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-catalog-content\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.384693 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjz8f\" (UniqueName: \"kubernetes.io/projected/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-kube-api-access-vjz8f\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.448911 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.449126 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-utilities\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.449172 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfhg5\" (UniqueName: \"kubernetes.io/projected/3618fc0f-e8b2-4476-a24d-662165a04ecc-kube-api-access-nfhg5\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.449210 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-catalog-content\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.449377 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.949360546 +0000 UTC m=+264.524105376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.485647 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zwknx"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.492812 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.492891 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.501011 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zwknx"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.540664 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.551134 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfhg5\" (UniqueName: \"kubernetes.io/projected/3618fc0f-e8b2-4476-a24d-662165a04ecc-kube-api-access-nfhg5\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.551179 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-catalog-content\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.551306 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.551337 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-utilities\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.551736 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-utilities\") pod \"community-operators-lzrtd\" 
(UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.552423 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-catalog-content\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.552637 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.052618813 +0000 UTC m=+264.627363643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.582146 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfhg5\" (UniqueName: \"kubernetes.io/projected/3618fc0f-e8b2-4476-a24d-662165a04ecc-kube-api-access-nfhg5\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.660788 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc9zp\" (UniqueName: \"kubernetes.io/projected/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-kube-api-access-hc9zp\") pod 
\"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.660897 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-config\") pod \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.660930 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-client-ca\") pod \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.660960 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-serving-cert\") pod \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.661097 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.661289 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-utilities\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.661401 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mff6k\" (UniqueName: \"kubernetes.io/projected/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-kube-api-access-mff6k\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.661445 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-catalog-content\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.661601 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.161579991 +0000 UTC m=+264.736324831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.662544 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-client-ca" (OuterVolumeSpecName: "client-ca") pod "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" (UID: "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.673890 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-kube-api-access-hc9zp" (OuterVolumeSpecName: "kube-api-access-hc9zp") pod "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" (UID: "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637"). InnerVolumeSpecName "kube-api-access-hc9zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.674185 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.680599 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-config" (OuterVolumeSpecName: "config") pod "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" (UID: "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.692450 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" (UID: "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.719250 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.768684 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-config\") pod \"496a64ab-b670-4201-9238-d60415ccba17\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.768767 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-proxy-ca-bundles\") pod \"496a64ab-b670-4201-9238-d60415ccba17\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.768792 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496a64ab-b670-4201-9238-d60415ccba17-serving-cert\") pod \"496a64ab-b670-4201-9238-d60415ccba17\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.768835 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df4jz\" (UniqueName: \"kubernetes.io/projected/496a64ab-b670-4201-9238-d60415ccba17-kube-api-access-df4jz\") pod \"496a64ab-b670-4201-9238-d60415ccba17\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.768880 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-client-ca\") pod \"496a64ab-b670-4201-9238-d60415ccba17\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769040 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-catalog-content\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769083 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769119 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-utilities\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769222 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mff6k\" (UniqueName: \"kubernetes.io/projected/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-kube-api-access-mff6k\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769262 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769273 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-client-ca\") on node \"crc\" 
DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769282 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769291 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc9zp\" (UniqueName: \"kubernetes.io/projected/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-kube-api-access-hc9zp\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.770459 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-config" (OuterVolumeSpecName: "config") pod "496a64ab-b670-4201-9238-d60415ccba17" (UID: "496a64ab-b670-4201-9238-d60415ccba17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.770936 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "496a64ab-b670-4201-9238-d60415ccba17" (UID: "496a64ab-b670-4201-9238-d60415ccba17"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.773751 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-catalog-content\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.774128 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.27410925 +0000 UTC m=+264.848854090 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.774208 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-client-ca" (OuterVolumeSpecName: "client-ca") pod "496a64ab-b670-4201-9238-d60415ccba17" (UID: "496a64ab-b670-4201-9238-d60415ccba17"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.774490 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-utilities\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.780742 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496a64ab-b670-4201-9238-d60415ccba17-kube-api-access-df4jz" (OuterVolumeSpecName: "kube-api-access-df4jz") pod "496a64ab-b670-4201-9238-d60415ccba17" (UID: "496a64ab-b670-4201-9238-d60415ccba17"). InnerVolumeSpecName "kube-api-access-df4jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.781619 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496a64ab-b670-4201-9238-d60415ccba17-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496a64ab-b670-4201-9238-d60415ccba17" (UID: "496a64ab-b670-4201-9238-d60415ccba17"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.781893 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"] Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.782122 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" containerName="route-controller-manager" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.782135 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" containerName="route-controller-manager" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.782150 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496a64ab-b670-4201-9238-d60415ccba17" containerName="controller-manager" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.782156 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="496a64ab-b670-4201-9238-d60415ccba17" containerName="controller-manager" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.795764 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" containerName="route-controller-manager" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.795830 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="496a64ab-b670-4201-9238-d60415ccba17" containerName="controller-manager" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.796363 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.812568 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.826829 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mff6k\" (UniqueName: \"kubernetes.io/projected/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-kube-api-access-mff6k\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.867590 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbbtb"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.871829 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d5kq\" (UniqueName: \"kubernetes.io/projected/b379a820-627d-403c-b50b-b6fbea94aa65-kube-api-access-6d5kq\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872139 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-config\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872159 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b379a820-627d-403c-b50b-b6fbea94aa65-serving-cert\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872296 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-client-ca\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872337 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872353 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872362 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496a64ab-b670-4201-9238-d60415ccba17-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 
09:06:37.872371 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df4jz\" (UniqueName: \"kubernetes.io/projected/496a64ab-b670-4201-9238-d60415ccba17-kube-api-access-df4jz\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872382 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.872460 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.372443777 +0000 UTC m=+264.947188617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: W0318 09:06:37.932513 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4cfa2f4_0114_46ae_a89f_3b2eac3ea0fa.slice/crio-a612db4db085f0867c6a8718b7e590183f11921029875dedeb88f07468c98457 WatchSource:0}: Error finding container a612db4db085f0867c6a8718b7e590183f11921029875dedeb88f07468c98457: Status 404 returned error can't find the container with id a612db4db085f0867c6a8718b7e590183f11921029875dedeb88f07468c98457 Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.940819 4778 plugin_watcher.go:194] "Adding socket path or 
updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.972916 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-client-ca\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.972968 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d5kq\" (UniqueName: \"kubernetes.io/projected/b379a820-627d-403c-b50b-b6fbea94aa65-kube-api-access-6d5kq\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.972993 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-config\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.973014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b379a820-627d-403c-b50b-b6fbea94aa65-serving-cert\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.973061 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.973342 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.473330796 +0000 UTC m=+265.048075636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.974750 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-client-ca\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.988913 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b379a820-627d-403c-b50b-b6fbea94aa65-serving-cert\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 
crc kubenswrapper[4778]: I0318 09:06:37.989158 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-config\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.991469 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d5kq\" (UniqueName: \"kubernetes.io/projected/b379a820-627d-403c-b50b-b6fbea94aa65-kube-api-access-6d5kq\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.028267 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzrtd"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.055698 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qvn4w"] Mar 18 09:06:38 crc kubenswrapper[4778]: W0318 09:06:38.058669 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3618fc0f_e8b2_4476_a24d_662165a04ecc.slice/crio-f49ddb65c41adfb45d65de1198b0f82574583119a6d4929d1ffa55ce9f770dcc WatchSource:0}: Error finding container f49ddb65c41adfb45d65de1198b0f82574583119a6d4929d1ffa55ce9f770dcc: Status 404 returned error can't find the container with id f49ddb65c41adfb45d65de1198b0f82574583119a6d4929d1ffa55ce9f770dcc Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.064489 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" 
event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerStarted","Data":"a612db4db085f0867c6a8718b7e590183f11921029875dedeb88f07468c98457"} Mar 18 09:06:38 crc kubenswrapper[4778]: W0318 09:06:38.067788 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded8eaf37_d7fe_43d1_8d20_fffdd71748cc.slice/crio-5c2b4b3fdbc29641b0fd4b628d894c39c32fe66c2451fed6064a04a8d6f0eddd WatchSource:0}: Error finding container 5c2b4b3fdbc29641b0fd4b628d894c39c32fe66c2451fed6064a04a8d6f0eddd: Status 404 returned error can't find the container with id 5c2b4b3fdbc29641b0fd4b628d894c39c32fe66c2451fed6064a04a8d6f0eddd Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.069805 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" event={"ID":"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637","Type":"ContainerDied","Data":"d169670b40d53da31764b4cc4e5df28c3ae9d827ab8db481e9ff9be957e4d6a9"} Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.069849 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.069878 4778 scope.go:117] "RemoveContainer" containerID="e9257ea8202297fa6d8a75a0ad4f78bf4d5ee04b58490bb83b63980f5daf0cc8" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.073703 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.073976 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.573944147 +0000 UTC m=+265.148688987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.074047 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.075234 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.575144911 +0000 UTC m=+265.149889741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.077666 4778 generic.go:334] "Generic (PLEG): container finished" podID="496a64ab-b670-4201-9238-d60415ccba17" containerID="f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0" exitCode=0 Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.077857 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.077863 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" event={"ID":"496a64ab-b670-4201-9238-d60415ccba17","Type":"ContainerDied","Data":"f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0"} Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.077933 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" event={"ID":"496a64ab-b670-4201-9238-d60415ccba17","Type":"ContainerDied","Data":"eca604f85fcfe59c73e6a0d9a12120a2d10108fa64f5bc3107b7c718f96ed398"} Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.091500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" event={"ID":"bead92a8-42de-4171-9c0c-790d64a6d14a","Type":"ContainerStarted","Data":"12ed8de0e78c44d58d65f6ebd30bafe5fcfe8674978d824ea5bd048d38599130"} Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.116462 4778 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.121537 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.124000 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.136069 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvvlz"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.136643 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvvlz"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.146177 4778 scope.go:117] "RemoveContainer" containerID="f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.148382 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.175787 4778 scope.go:117] "RemoveContainer" containerID="f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.175869 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.176090 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.676064511 +0000 UTC m=+265.250809351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.176250 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.177209 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0\": container with ID starting with f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0 not found: ID does not exist" containerID="f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0" Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.177269 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.677256005 +0000 UTC m=+265.252000845 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.177318 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0"} err="failed to get container status \"f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0\": rpc error: code = NotFound desc = could not find container \"f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0\": container with ID starting with f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0 not found: ID does not exist" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.218314 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496a64ab-b670-4201-9238-d60415ccba17" path="/var/lib/kubelet/pods/496a64ab-b670-4201-9238-d60415ccba17/volumes" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.218835 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" path="/var/lib/kubelet/pods/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637/volumes" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.257255 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:38 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:38 crc kubenswrapper[4778]: 
[+]process-running ok Mar 18 09:06:38 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.257309 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.278378 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.278574 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.778544566 +0000 UTC m=+265.353289406 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.278877 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.279324 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.779306037 +0000 UTC m=+265.354050877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.379924 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.380164 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.880118394 +0000 UTC m=+265.454863224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.381040 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.381495 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.881485004 +0000 UTC m=+265.456229844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.438739 4778 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.438775 4778 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.433038 4778 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T09:06:37.940856685Z","Handler":null,"Name":""} Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.445141 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zwknx"] Mar 18 09:06:38 crc kubenswrapper[4778]: W0318 09:06:38.461955 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b7607d_fa16_45c1_a0cb_c5ec39a288fb.slice/crio-fd97a190415e4d1219ea6675fa83baba068c5459e6a2c0b39450f9d62ba9f267 WatchSource:0}: Error finding container fd97a190415e4d1219ea6675fa83baba068c5459e6a2c0b39450f9d62ba9f267: Status 404 returned error can't find the container with id fd97a190415e4d1219ea6675fa83baba068c5459e6a2c0b39450f9d62ba9f267 Mar 18 09:06:38 crc kubenswrapper[4778]: 
I0318 09:06:38.481131 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.481779 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.488671 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.583627 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.586270 4778 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.586312 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.608803 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.763793 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8bc989ddd-wh99s"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.765275 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.770425 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.770958 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.771082 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.771275 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.772656 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.774668 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.779274 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8bc989ddd-wh99s"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.780034 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.789899 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2110837-0c54-448f-8b94-68bdea470d14-serving-cert\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " 
pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.789954 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-config\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.789986 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb5z6\" (UniqueName: \"kubernetes.io/projected/c2110837-0c54-448f-8b94-68bdea470d14-kube-api-access-xb5z6\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.790097 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-client-ca\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.790131 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-proxy-ca-bundles\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.882334 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.892251 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-client-ca\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.892327 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-proxy-ca-bundles\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.892375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2110837-0c54-448f-8b94-68bdea470d14-serving-cert\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.892409 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-config\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.892489 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb5z6\" (UniqueName: 
\"kubernetes.io/projected/c2110837-0c54-448f-8b94-68bdea470d14-kube-api-access-xb5z6\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.894815 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-client-ca\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.899111 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-proxy-ca-bundles\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.900363 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-config\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.920756 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2110837-0c54-448f-8b94-68bdea470d14-serving-cert\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.933465 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-xb5z6\" (UniqueName: \"kubernetes.io/projected/c2110837-0c54-448f-8b94-68bdea470d14-kube-api-access-xb5z6\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.074996 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6qgm2"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.077118 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.081217 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.107937 4778 generic.go:334] "Generic (PLEG): container finished" podID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerID="e143a776ed51bb64025b24b3e1cc128e2a2ca67730b9a34f438ed6857f8be065" exitCode=0 Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.108048 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerDied","Data":"e143a776ed51bb64025b24b3e1cc128e2a2ca67730b9a34f438ed6857f8be065"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.116143 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" event={"ID":"b379a820-627d-403c-b50b-b6fbea94aa65","Type":"ContainerStarted","Data":"c52ef847baaab8c966dfa357e5150a1446e9ecf64dd0cf070d104b1c9769de57"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.116309 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" 
event={"ID":"b379a820-627d-403c-b50b-b6fbea94aa65","Type":"ContainerStarted","Data":"6e0b29785a4e6e0a0788e6fe250c71b8bb1b8d6f56dea6368383dd453f7f7456"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.116743 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.120874 4778 generic.go:334] "Generic (PLEG): container finished" podID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerID="9e9b1baa8deb4596f595ec2a830346f2addf7d69c909efa6643ba0c90cdd01c7" exitCode=0 Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.121054 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerDied","Data":"9e9b1baa8deb4596f595ec2a830346f2addf7d69c909efa6643ba0c90cdd01c7"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.121106 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerStarted","Data":"5c2b4b3fdbc29641b0fd4b628d894c39c32fe66c2451fed6064a04a8d6f0eddd"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.133781 4778 generic.go:334] "Generic (PLEG): container finished" podID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerID="33f31c9dc67a1137fd25116299f097d08a1ad1b4a3924bc1eb5ff8d0db0c9727" exitCode=0 Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.133927 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerDied","Data":"33f31c9dc67a1137fd25116299f097d08a1ad1b4a3924bc1eb5ff8d0db0c9727"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.134002 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerStarted","Data":"f49ddb65c41adfb45d65de1198b0f82574583119a6d4929d1ffa55ce9f770dcc"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.142285 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qgm2"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.151319 4778 generic.go:334] "Generic (PLEG): container finished" podID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerID="44d1d0b1ecaf0bd45db18a8ca3c0502c00748ea75b870a51131c12eecf1aa1f8" exitCode=0 Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.151395 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerDied","Data":"44d1d0b1ecaf0bd45db18a8ca3c0502c00748ea75b870a51131c12eecf1aa1f8"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.151423 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerStarted","Data":"fd97a190415e4d1219ea6675fa83baba068c5459e6a2c0b39450f9d62ba9f267"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.159343 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" podStartSLOduration=2.159328988 podStartE2EDuration="2.159328988s" podCreationTimestamp="2026-03-18 09:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:39.157249009 +0000 UTC m=+265.731993879" watchObservedRunningTime="2026-03-18 09:06:39.159328988 +0000 UTC m=+265.734073848" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.162682 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.167841 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" event={"ID":"bead92a8-42de-4171-9c0c-790d64a6d14a","Type":"ContainerStarted","Data":"274b846862188c4e07460e1a5f36c4313d3d9dfe5281291dc27aa0c023580054"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.167902 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" event={"ID":"bead92a8-42de-4171-9c0c-790d64a6d14a","Type":"ContainerStarted","Data":"da66d81f59ca90001e0fc3ec7bd56b07c7dfea1eaff294030a6c458e89f889d6"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.188075 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nkbq4"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.196315 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf8tc\" (UniqueName: \"kubernetes.io/projected/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-kube-api-access-qf8tc\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.196364 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-utilities\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.196479 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-catalog-content\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.206688 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" podStartSLOduration=12.206670899 podStartE2EDuration="12.206670899s" podCreationTimestamp="2026-03-18 09:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:39.200983308 +0000 UTC m=+265.775728138" watchObservedRunningTime="2026-03-18 09:06:39.206670899 +0000 UTC m=+265.781415729" Mar 18 09:06:39 crc kubenswrapper[4778]: W0318 09:06:39.223113 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2faf8fa8_d474_4c7d_8566_8abc58d7d5ad.slice/crio-c662bfec2aa4a3db7809425b04cf847a1727f28e0c070e4d7298a20004e2533f WatchSource:0}: Error finding container c662bfec2aa4a3db7809425b04cf847a1727f28e0c070e4d7298a20004e2533f: Status 404 returned error can't find the container with id c662bfec2aa4a3db7809425b04cf847a1727f28e0c070e4d7298a20004e2533f Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.248072 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.258958 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:39 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:39 crc kubenswrapper[4778]: 
[+]process-running ok Mar 18 09:06:39 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.259026 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.300150 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf8tc\" (UniqueName: \"kubernetes.io/projected/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-kube-api-access-qf8tc\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.300242 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-utilities\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.300419 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-catalog-content\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.300736 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-utilities\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: 
I0318 09:06:39.300856 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-catalog-content\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.343492 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf8tc\" (UniqueName: \"kubernetes.io/projected/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-kube-api-access-qf8tc\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.375122 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51792: no serving certificate available for the kubelet" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.401695 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.473382 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8bc989ddd-wh99s"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.480713 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rscg9"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.481977 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: W0318 09:06:39.495504 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2110837_0c54_448f_8b94_68bdea470d14.slice/crio-1f4f69aa812ccf6e175392ab8a7bd2a35678566f0ef2ecd7701be29d6dc1d5ce WatchSource:0}: Error finding container 1f4f69aa812ccf6e175392ab8a7bd2a35678566f0ef2ecd7701be29d6dc1d5ce: Status 404 returned error can't find the container with id 1f4f69aa812ccf6e175392ab8a7bd2a35678566f0ef2ecd7701be29d6dc1d5ce Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.497003 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rscg9"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.506792 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnfhp\" (UniqueName: \"kubernetes.io/projected/3aedbf59-d23d-409e-9742-09824ed6ef2a-kube-api-access-gnfhp\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.506881 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-catalog-content\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.506923 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-utilities\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " 
pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.607903 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-catalog-content\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.607983 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-utilities\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.608033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnfhp\" (UniqueName: \"kubernetes.io/projected/3aedbf59-d23d-409e-9742-09824ed6ef2a-kube-api-access-gnfhp\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.609105 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-catalog-content\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.609354 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-utilities\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" 
Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.628812 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnfhp\" (UniqueName: \"kubernetes.io/projected/3aedbf59-d23d-409e-9742-09824ed6ef2a-kube-api-access-gnfhp\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.777822 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.778532 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.782615 4778 patch_prober.go:28] interesting pod/console-f9d7485db-pgsqh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.782698 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pgsqh" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.819845 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.855601 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qgm2"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.857606 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.858989 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:39 crc kubenswrapper[4778]: W0318 09:06:39.865293 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57f9c6f6_c20e_4e28_aec4_f0104ddb2b47.slice/crio-a8ae2f2fdacacfb271851ed06f03f49aedc9a4cd93513577d22ad1180d7345ef WatchSource:0}: Error finding container a8ae2f2fdacacfb271851ed06f03f49aedc9a4cd93513577d22ad1180d7345ef: Status 404 returned error can't find the container with id a8ae2f2fdacacfb271851ed06f03f49aedc9a4cd93513577d22ad1180d7345ef Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.872626 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.087495 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6kvnk"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.089045 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.101239 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.103230 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.103266 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.103669 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6kvnk"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.107093 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.107148 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.123610 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fp6w4\" (UniqueName: \"kubernetes.io/projected/938982a6-57b0-4870-abed-a98c42196ae6-kube-api-access-fp6w4\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.123703 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-catalog-content\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.123744 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-utilities\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.153043 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rscg9"] Mar 18 09:06:40 crc kubenswrapper[4778]: W0318 09:06:40.172116 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aedbf59_d23d_409e_9742_09824ed6ef2a.slice/crio-535b754da1c090611910ad3734bf58ec9a371b36592f999639d434308dca9ac6 WatchSource:0}: Error finding container 535b754da1c090611910ad3734bf58ec9a371b36592f999639d434308dca9ac6: Status 404 returned error can't find the container with id 535b754da1c090611910ad3734bf58ec9a371b36592f999639d434308dca9ac6 Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.187375 4778 generic.go:334] "Generic (PLEG): container finished" podID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" 
containerID="113dc27ffd2ebd355aaf8e22c8a148444f799a56c796af33fbc9fe643673da94" exitCode=0 Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.227409 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-utilities\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.227493 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp6w4\" (UniqueName: \"kubernetes.io/projected/938982a6-57b0-4870-abed-a98c42196ae6-kube-api-access-fp6w4\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.227580 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-catalog-content\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.228175 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-catalog-content\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.228727 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-utilities\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " 
pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.234701 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.247398 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qgm2" event={"ID":"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47","Type":"ContainerDied","Data":"113dc27ffd2ebd355aaf8e22c8a148444f799a56c796af33fbc9fe643673da94"} Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.247443 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qgm2" event={"ID":"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47","Type":"ContainerStarted","Data":"a8ae2f2fdacacfb271851ed06f03f49aedc9a4cd93513577d22ad1180d7345ef"} Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.247459 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" event={"ID":"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad","Type":"ContainerStarted","Data":"a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b"} Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.247477 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" event={"ID":"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad","Type":"ContainerStarted","Data":"c662bfec2aa4a3db7809425b04cf847a1727f28e0c070e4d7298a20004e2533f"} Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.247494 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.257109 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 
09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.263798 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp6w4\" (UniqueName: \"kubernetes.io/projected/938982a6-57b0-4870-abed-a98c42196ae6-kube-api-access-fp6w4\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.291512 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:40 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:40 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:40 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.292036 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.302387 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" event={"ID":"c2110837-0c54-448f-8b94-68bdea470d14","Type":"ContainerStarted","Data":"f7325e2f133bd8a9b84c58690841dede1fe1ff86d10cd0f0298df6312d40064a"} Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.303014 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.303048 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" 
event={"ID":"c2110837-0c54-448f-8b94-68bdea470d14","Type":"ContainerStarted","Data":"1f4f69aa812ccf6e175392ab8a7bd2a35678566f0ef2ecd7701be29d6dc1d5ce"} Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.312495 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" podStartSLOduration=196.312473399 podStartE2EDuration="3m16.312473399s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:40.301662243 +0000 UTC m=+266.876407083" watchObservedRunningTime="2026-03-18 09:06:40.312473399 +0000 UTC m=+266.887218239" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.330409 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.346507 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.377435 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" podStartSLOduration=3.377412639 podStartE2EDuration="3.377412639s" podCreationTimestamp="2026-03-18 09:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:40.351555717 +0000 UTC m=+266.926300557" watchObservedRunningTime="2026-03-18 09:06:40.377412639 +0000 UTC m=+266.952157479" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.479161 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.515278 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.516553 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.521001 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.538501 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.538854 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.574301 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t74gc"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.575707 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.588743 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t74gc"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.591976 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.592859 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.614182 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.614713 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.616240 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.641971 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-catalog-content\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.642056 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-utilities\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.642083 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7a7245b-196d-4cea-916b-858e30dcc936-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.647801 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-dlv42\" (UniqueName: \"kubernetes.io/projected/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-kube-api-access-dlv42\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.648093 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7a7245b-196d-4cea-916b-858e30dcc936-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750019 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60b8330d-375e-49a4-948b-0aaad227e09e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750499 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7a7245b-196d-4cea-916b-858e30dcc936-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750532 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60b8330d-375e-49a4-948b-0aaad227e09e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750567 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-catalog-content\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750624 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-utilities\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750664 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7a7245b-196d-4cea-916b-858e30dcc936-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750722 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlv42\" (UniqueName: \"kubernetes.io/projected/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-kube-api-access-dlv42\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.751068 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7a7245b-196d-4cea-916b-858e30dcc936-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.751899 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-catalog-content\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.751988 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-utilities\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.777848 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7a7245b-196d-4cea-916b-858e30dcc936-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.779292 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlv42\" (UniqueName: \"kubernetes.io/projected/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-kube-api-access-dlv42\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.853624 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60b8330d-375e-49a4-948b-0aaad227e09e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.853686 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/60b8330d-375e-49a4-948b-0aaad227e09e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.854013 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60b8330d-375e-49a4-948b-0aaad227e09e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.908841 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60b8330d-375e-49a4-948b-0aaad227e09e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.922392 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.974556 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.025468 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.063575 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6kvnk"] Mar 18 09:06:41 crc kubenswrapper[4778]: W0318 09:06:41.211626 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod938982a6_57b0_4870_abed_a98c42196ae6.slice/crio-40f7240695ce3bda0e1e4c2a6151b9a381bd8173f92f88c530036fbcfc0a002f WatchSource:0}: Error finding container 40f7240695ce3bda0e1e4c2a6151b9a381bd8173f92f88c530036fbcfc0a002f: Status 404 returned error can't find the container with id 40f7240695ce3bda0e1e4c2a6151b9a381bd8173f92f88c530036fbcfc0a002f Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.263031 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:41 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:41 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:41 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.263102 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.324007 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerStarted","Data":"40f7240695ce3bda0e1e4c2a6151b9a381bd8173f92f88c530036fbcfc0a002f"} Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 
09:06:41.345048 4778 generic.go:334] "Generic (PLEG): container finished" podID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerID="1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58" exitCode=0 Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.346701 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rscg9" event={"ID":"3aedbf59-d23d-409e-9742-09824ed6ef2a","Type":"ContainerDied","Data":"1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58"} Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.346749 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rscg9" event={"ID":"3aedbf59-d23d-409e-9742-09824ed6ef2a","Type":"ContainerStarted","Data":"535b754da1c090611910ad3734bf58ec9a371b36592f999639d434308dca9ac6"} Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.570429 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 09:06:41 crc kubenswrapper[4778]: W0318 09:06:41.688932 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda7a7245b_196d_4cea_916b_858e30dcc936.slice/crio-7ea6f7a6e8b0a8ff5b75d5e6b844aba55abc48251e304d4a97f397a66a9b6cb3 WatchSource:0}: Error finding container 7ea6f7a6e8b0a8ff5b75d5e6b844aba55abc48251e304d4a97f397a66a9b6cb3: Status 404 returned error can't find the container with id 7ea6f7a6e8b0a8ff5b75d5e6b844aba55abc48251e304d4a97f397a66a9b6cb3 Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.726689 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t74gc"] Mar 18 09:06:41 crc kubenswrapper[4778]: W0318 09:06:41.764165 4778 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01828fdf_ef1b_44e3_905b_aec0c6aaa44f.slice/crio-3806b4b94100c7a5155593ac6313aa0ecd333d565872aab11d373225ceac8468 WatchSource:0}: Error finding container 3806b4b94100c7a5155593ac6313aa0ecd333d565872aab11d373225ceac8468: Status 404 returned error can't find the container with id 3806b4b94100c7a5155593ac6313aa0ecd333d565872aab11d373225ceac8468 Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.831587 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.266175 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:42 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:42 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:42 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.266560 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.383027 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"60b8330d-375e-49a4-948b-0aaad227e09e","Type":"ContainerStarted","Data":"5a32dc2a00f35187349fa55da1e46dd7d020a4f5a521c2ae089869b21c0783e4"} Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.426755 4778 generic.go:334] "Generic (PLEG): container finished" podID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerID="041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118" exitCode=0 Mar 18 
09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.426894 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerDied","Data":"041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118"} Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.426926 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerStarted","Data":"3806b4b94100c7a5155593ac6313aa0ecd333d565872aab11d373225ceac8468"} Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.481179 4778 generic.go:334] "Generic (PLEG): container finished" podID="938982a6-57b0-4870-abed-a98c42196ae6" containerID="8977456d128ab832e4d2b65a1ebbe275173e48c92b3849c579d8a9cc853d0ce8" exitCode=0 Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.481871 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerDied","Data":"8977456d128ab832e4d2b65a1ebbe275173e48c92b3849c579d8a9cc853d0ce8"} Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.494781 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a7a7245b-196d-4cea-916b-858e30dcc936","Type":"ContainerStarted","Data":"7ea6f7a6e8b0a8ff5b75d5e6b844aba55abc48251e304d4a97f397a66a9b6cb3"} Mar 18 09:06:43 crc kubenswrapper[4778]: I0318 09:06:43.271179 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:43 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:43 crc kubenswrapper[4778]: [+]process-running ok Mar 18 
09:06:43 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:43 crc kubenswrapper[4778]: I0318 09:06:43.271248 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:43 crc kubenswrapper[4778]: I0318 09:06:43.515025 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a7a7245b-196d-4cea-916b-858e30dcc936","Type":"ContainerStarted","Data":"92309743aa161b0e0c5404c8814af46d06b54c2a29edec78dc603167720d6d87"} Mar 18 09:06:43 crc kubenswrapper[4778]: I0318 09:06:43.520563 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"60b8330d-375e-49a4-948b-0aaad227e09e","Type":"ContainerStarted","Data":"5dc6403b3a8eff2a47e98ebf150f40bbe6b37ddfccac40d625f7e4f878779c41"} Mar 18 09:06:43 crc kubenswrapper[4778]: I0318 09:06:43.555453 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.555433888 podStartE2EDuration="3.555433888s" podCreationTimestamp="2026-03-18 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:43.538090495 +0000 UTC m=+270.112835355" watchObservedRunningTime="2026-03-18 09:06:43.555433888 +0000 UTC m=+270.130178728" Mar 18 09:06:43 crc kubenswrapper[4778]: I0318 09:06:43.556467 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.556459636 podStartE2EDuration="3.556459636s" podCreationTimestamp="2026-03-18 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:43.55342159 +0000 UTC m=+270.128166430" watchObservedRunningTime="2026-03-18 09:06:43.556459636 +0000 UTC m=+270.131204466" Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.258150 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:44 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:44 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:44 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.258772 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.531097 4778 ???:1] "http: TLS handshake error from 192.168.126.11:54362: no serving certificate available for the kubelet" Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.545067 4778 generic.go:334] "Generic (PLEG): container finished" podID="60b8330d-375e-49a4-948b-0aaad227e09e" containerID="5dc6403b3a8eff2a47e98ebf150f40bbe6b37ddfccac40d625f7e4f878779c41" exitCode=0 Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.545153 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"60b8330d-375e-49a4-948b-0aaad227e09e","Type":"ContainerDied","Data":"5dc6403b3a8eff2a47e98ebf150f40bbe6b37ddfccac40d625f7e4f878779c41"} Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.558963 4778 generic.go:334] "Generic (PLEG): container finished" podID="a7a7245b-196d-4cea-916b-858e30dcc936" 
containerID="92309743aa161b0e0c5404c8814af46d06b54c2a29edec78dc603167720d6d87" exitCode=0 Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.559054 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a7a7245b-196d-4cea-916b-858e30dcc936","Type":"ContainerDied","Data":"92309743aa161b0e0c5404c8814af46d06b54c2a29edec78dc603167720d6d87"} Mar 18 09:06:45 crc kubenswrapper[4778]: I0318 09:06:45.256102 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:45 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:45 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:45 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:45 crc kubenswrapper[4778]: I0318 09:06:45.256169 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:45 crc kubenswrapper[4778]: I0318 09:06:45.860218 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:46 crc kubenswrapper[4778]: I0318 09:06:46.256122 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:46 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:46 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:46 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:46 crc kubenswrapper[4778]: I0318 
09:06:46.256181 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:47 crc kubenswrapper[4778]: I0318 09:06:47.257097 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:47 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:47 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:47 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:47 crc kubenswrapper[4778]: I0318 09:06:47.257151 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:47 crc kubenswrapper[4778]: I0318 09:06:47.451243 4778 ???:1] "http: TLS handshake error from 192.168.126.11:54372: no serving certificate available for the kubelet" Mar 18 09:06:48 crc kubenswrapper[4778]: I0318 09:06:48.255890 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:48 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:48 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:48 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:48 crc kubenswrapper[4778]: I0318 09:06:48.256187 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" 
podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:49 crc kubenswrapper[4778]: I0318 09:06:49.256390 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:49 crc kubenswrapper[4778]: I0318 09:06:49.260909 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:49 crc kubenswrapper[4778]: I0318 09:06:49.778497 4778 patch_prober.go:28] interesting pod/console-f9d7485db-pgsqh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 18 09:06:49 crc kubenswrapper[4778]: I0318 09:06:49.778587 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pgsqh" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 18 09:06:50 crc kubenswrapper[4778]: I0318 09:06:50.103037 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 18 09:06:50 crc kubenswrapper[4778]: I0318 09:06:50.103607 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 18 09:06:50 crc kubenswrapper[4778]: I0318 09:06:50.103470 4778 patch_prober.go:28] 
interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 18 09:06:50 crc kubenswrapper[4778]: I0318 09:06:50.103726 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 18 09:06:54 crc kubenswrapper[4778]: I0318 09:06:54.796846 4778 ???:1] "http: TLS handshake error from 192.168.126.11:58340: no serving certificate available for the kubelet" Mar 18 09:06:55 crc kubenswrapper[4778]: I0318 09:06:55.655992 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8bc989ddd-wh99s"] Mar 18 09:06:55 crc kubenswrapper[4778]: I0318 09:06:55.656256 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" podUID="c2110837-0c54-448f-8b94-68bdea470d14" containerName="controller-manager" containerID="cri-o://f7325e2f133bd8a9b84c58690841dede1fe1ff86d10cd0f0298df6312d40064a" gracePeriod=30 Mar 18 09:06:55 crc kubenswrapper[4778]: I0318 09:06:55.671040 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"] Mar 18 09:06:55 crc kubenswrapper[4778]: I0318 09:06:55.671355 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" podUID="b379a820-627d-403c-b50b-b6fbea94aa65" containerName="route-controller-manager" containerID="cri-o://c52ef847baaab8c966dfa357e5150a1446e9ecf64dd0cf070d104b1c9769de57" gracePeriod=30 Mar 18 09:06:56 crc 
kubenswrapper[4778]: I0318 09:06:56.766809 4778 generic.go:334] "Generic (PLEG): container finished" podID="c2110837-0c54-448f-8b94-68bdea470d14" containerID="f7325e2f133bd8a9b84c58690841dede1fe1ff86d10cd0f0298df6312d40064a" exitCode=0 Mar 18 09:06:56 crc kubenswrapper[4778]: I0318 09:06:56.766962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" event={"ID":"c2110837-0c54-448f-8b94-68bdea470d14","Type":"ContainerDied","Data":"f7325e2f133bd8a9b84c58690841dede1fe1ff86d10cd0f0298df6312d40064a"} Mar 18 09:06:58 crc kubenswrapper[4778]: I0318 09:06:58.150269 4778 patch_prober.go:28] interesting pod/route-controller-manager-67677f775c-zxrmx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Mar 18 09:06:58 crc kubenswrapper[4778]: I0318 09:06:58.150385 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" podUID="b379a820-627d-403c-b50b-b6fbea94aa65" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Mar 18 09:06:58 crc kubenswrapper[4778]: I0318 09:06:58.889431 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:59 crc kubenswrapper[4778]: I0318 09:06:59.163727 4778 patch_prober.go:28] interesting pod/controller-manager-8bc989ddd-wh99s container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 18 09:06:59 crc kubenswrapper[4778]: I0318 09:06:59.163796 4778 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" podUID="c2110837-0c54-448f-8b94-68bdea470d14" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 18 09:06:59 crc kubenswrapper[4778]: I0318 09:06:59.790038 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:59 crc kubenswrapper[4778]: I0318 09:06:59.799184 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:07:00 crc kubenswrapper[4778]: I0318 09:07:00.118018 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-tnw27" Mar 18 09:07:00 crc kubenswrapper[4778]: I0318 09:07:00.147798 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:07:00 crc kubenswrapper[4778]: I0318 09:07:00.147946 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.044382 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.095524 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7a7245b-196d-4cea-916b-858e30dcc936-kubelet-dir\") pod \"a7a7245b-196d-4cea-916b-858e30dcc936\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.095653 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7a7245b-196d-4cea-916b-858e30dcc936-kube-api-access\") pod \"a7a7245b-196d-4cea-916b-858e30dcc936\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.095773 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7a7245b-196d-4cea-916b-858e30dcc936-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a7a7245b-196d-4cea-916b-858e30dcc936" (UID: "a7a7245b-196d-4cea-916b-858e30dcc936"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.096245 4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7a7245b-196d-4cea-916b-858e30dcc936-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.103948 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a7245b-196d-4cea-916b-858e30dcc936-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a7a7245b-196d-4cea-916b-858e30dcc936" (UID: "a7a7245b-196d-4cea-916b-858e30dcc936"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.197266 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7a7245b-196d-4cea-916b-858e30dcc936-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.802331 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.802376 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a7a7245b-196d-4cea-916b-858e30dcc936","Type":"ContainerDied","Data":"7ea6f7a6e8b0a8ff5b75d5e6b844aba55abc48251e304d4a97f397a66a9b6cb3"} Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.802832 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ea6f7a6e8b0a8ff5b75d5e6b844aba55abc48251e304d4a97f397a66a9b6cb3" Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.805980 4778 generic.go:334] "Generic (PLEG): container finished" podID="b379a820-627d-403c-b50b-b6fbea94aa65" containerID="c52ef847baaab8c966dfa357e5150a1446e9ecf64dd0cf070d104b1c9769de57" exitCode=0 Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.806028 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" event={"ID":"b379a820-627d-403c-b50b-b6fbea94aa65","Type":"ContainerDied","Data":"c52ef847baaab8c966dfa357e5150a1446e9ecf64dd0cf070d104b1c9769de57"} Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.112608 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.239093 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60b8330d-375e-49a4-948b-0aaad227e09e-kubelet-dir\") pod \"60b8330d-375e-49a4-948b-0aaad227e09e\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.239237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60b8330d-375e-49a4-948b-0aaad227e09e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "60b8330d-375e-49a4-948b-0aaad227e09e" (UID: "60b8330d-375e-49a4-948b-0aaad227e09e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.239249 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60b8330d-375e-49a4-948b-0aaad227e09e-kube-api-access\") pod \"60b8330d-375e-49a4-948b-0aaad227e09e\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.239743 4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60b8330d-375e-49a4-948b-0aaad227e09e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.251536 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b8330d-375e-49a4-948b-0aaad227e09e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "60b8330d-375e-49a4-948b-0aaad227e09e" (UID: "60b8330d-375e-49a4-948b-0aaad227e09e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.341816 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60b8330d-375e-49a4-948b-0aaad227e09e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.497450 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.506725 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544038 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb5z6\" (UniqueName: \"kubernetes.io/projected/c2110837-0c54-448f-8b94-68bdea470d14-kube-api-access-xb5z6\") pod \"c2110837-0c54-448f-8b94-68bdea470d14\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544101 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d5kq\" (UniqueName: \"kubernetes.io/projected/b379a820-627d-403c-b50b-b6fbea94aa65-kube-api-access-6d5kq\") pod \"b379a820-627d-403c-b50b-b6fbea94aa65\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544315 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b379a820-627d-403c-b50b-b6fbea94aa65-serving-cert\") pod \"b379a820-627d-403c-b50b-b6fbea94aa65\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544357 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-client-ca\") pod \"b379a820-627d-403c-b50b-b6fbea94aa65\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544402 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2110837-0c54-448f-8b94-68bdea470d14-serving-cert\") pod \"c2110837-0c54-448f-8b94-68bdea470d14\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544428 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-proxy-ca-bundles\") pod \"c2110837-0c54-448f-8b94-68bdea470d14\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544589 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-config\") pod \"c2110837-0c54-448f-8b94-68bdea470d14\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544620 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-config\") pod \"b379a820-627d-403c-b50b-b6fbea94aa65\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544650 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-client-ca\") pod \"c2110837-0c54-448f-8b94-68bdea470d14\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.546948 4778 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-client-ca" (OuterVolumeSpecName: "client-ca") pod "c2110837-0c54-448f-8b94-68bdea470d14" (UID: "c2110837-0c54-448f-8b94-68bdea470d14"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.547188 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-config" (OuterVolumeSpecName: "config") pod "b379a820-627d-403c-b50b-b6fbea94aa65" (UID: "b379a820-627d-403c-b50b-b6fbea94aa65"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.547675 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c2110837-0c54-448f-8b94-68bdea470d14" (UID: "c2110837-0c54-448f-8b94-68bdea470d14"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.547700 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-config" (OuterVolumeSpecName: "config") pod "c2110837-0c54-448f-8b94-68bdea470d14" (UID: "c2110837-0c54-448f-8b94-68bdea470d14"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.547705 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-client-ca" (OuterVolumeSpecName: "client-ca") pod "b379a820-627d-403c-b50b-b6fbea94aa65" (UID: "b379a820-627d-403c-b50b-b6fbea94aa65"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.549622 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b379a820-627d-403c-b50b-b6fbea94aa65-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b379a820-627d-403c-b50b-b6fbea94aa65" (UID: "b379a820-627d-403c-b50b-b6fbea94aa65"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.549658 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2110837-0c54-448f-8b94-68bdea470d14-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c2110837-0c54-448f-8b94-68bdea470d14" (UID: "c2110837-0c54-448f-8b94-68bdea470d14"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.550788 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b379a820-627d-403c-b50b-b6fbea94aa65-kube-api-access-6d5kq" (OuterVolumeSpecName: "kube-api-access-6d5kq") pod "b379a820-627d-403c-b50b-b6fbea94aa65" (UID: "b379a820-627d-403c-b50b-b6fbea94aa65"). InnerVolumeSpecName "kube-api-access-6d5kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.552365 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2110837-0c54-448f-8b94-68bdea470d14-kube-api-access-xb5z6" (OuterVolumeSpecName: "kube-api-access-xb5z6") pod "c2110837-0c54-448f-8b94-68bdea470d14" (UID: "c2110837-0c54-448f-8b94-68bdea470d14"). InnerVolumeSpecName "kube-api-access-xb5z6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.646914 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb5z6\" (UniqueName: \"kubernetes.io/projected/c2110837-0c54-448f-8b94-68bdea470d14-kube-api-access-xb5z6\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.646963 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d5kq\" (UniqueName: \"kubernetes.io/projected/b379a820-627d-403c-b50b-b6fbea94aa65-kube-api-access-6d5kq\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.646978 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b379a820-627d-403c-b50b-b6fbea94aa65-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.646992 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.647008 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2110837-0c54-448f-8b94-68bdea470d14-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.647024 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.647037 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.647050 4778 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.647061 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.826015 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" event={"ID":"c2110837-0c54-448f-8b94-68bdea470d14","Type":"ContainerDied","Data":"1f4f69aa812ccf6e175392ab8a7bd2a35678566f0ef2ecd7701be29d6dc1d5ce"} Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.826073 4778 scope.go:117] "RemoveContainer" containerID="f7325e2f133bd8a9b84c58690841dede1fe1ff86d10cd0f0298df6312d40064a" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.826073 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.827539 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.827531 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" event={"ID":"b379a820-627d-403c-b50b-b6fbea94aa65","Type":"ContainerDied","Data":"6e0b29785a4e6e0a0788e6fe250c71b8bb1b8d6f56dea6368383dd453f7f7456"} Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.829069 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"60b8330d-375e-49a4-948b-0aaad227e09e","Type":"ContainerDied","Data":"5a32dc2a00f35187349fa55da1e46dd7d020a4f5a521c2ae089869b21c0783e4"} Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.829099 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a32dc2a00f35187349fa55da1e46dd7d020a4f5a521c2ae089869b21c0783e4" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.829150 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.864359 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"] Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.868396 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"] Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.883224 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8bc989ddd-wh99s"] Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.886471 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8bc989ddd-wh99s"] Mar 18 09:07:05 crc kubenswrapper[4778]: E0318 09:07:05.563955 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 09:07:05 crc kubenswrapper[4778]: E0318 09:07:05.564488 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:07:05 crc kubenswrapper[4778]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 09:07:05 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-55w8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29563746-b66f7_openshift-infra(c3be356e-94af-47db-a182-dd8a57024619): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 18 09:07:05 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:07:05 crc kubenswrapper[4778]: E0318 09:07:05.565631 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29563746-b66f7" podUID="c3be356e-94af-47db-a182-dd8a57024619" Mar 18 09:07:05 crc kubenswrapper[4778]: E0318 09:07:05.835707 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29563746-b66f7" podUID="c3be356e-94af-47db-a182-dd8a57024619" Mar 18 09:07:06 crc kubenswrapper[4778]: I0318 09:07:06.195515 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b379a820-627d-403c-b50b-b6fbea94aa65" 
path="/var/lib/kubelet/pods/b379a820-627d-403c-b50b-b6fbea94aa65/volumes" Mar 18 09:07:06 crc kubenswrapper[4778]: I0318 09:07:06.196150 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2110837-0c54-448f-8b94-68bdea470d14" path="/var/lib/kubelet/pods/c2110837-0c54-448f-8b94-68bdea470d14/volumes" Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.372161 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.373011 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:07:08 crc kubenswrapper[4778]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 09:07:08 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zjwps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29563744-btdt7_openshift-infra(54961f10-93b0-433f-8a7d-b30d69178e9a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled Mar 18 09:07:08 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.374134 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29563744-btdt7" podUID="54961f10-93b0-433f-8a7d-b30d69178e9a" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801337 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"] Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.801813 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b379a820-627d-403c-b50b-b6fbea94aa65" containerName="route-controller-manager" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801825 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b379a820-627d-403c-b50b-b6fbea94aa65" containerName="route-controller-manager" Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.801847 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b8330d-375e-49a4-948b-0aaad227e09e" containerName="pruner" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801853 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b8330d-375e-49a4-948b-0aaad227e09e" containerName="pruner" Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.801865 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2110837-0c54-448f-8b94-68bdea470d14" containerName="controller-manager" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801872 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2110837-0c54-448f-8b94-68bdea470d14" containerName="controller-manager" Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.801880 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a7a7245b-196d-4cea-916b-858e30dcc936" containerName="pruner" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801885 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a7245b-196d-4cea-916b-858e30dcc936" containerName="pruner" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801982 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a7245b-196d-4cea-916b-858e30dcc936" containerName="pruner" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801994 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b379a820-627d-403c-b50b-b6fbea94aa65" containerName="route-controller-manager" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.802001 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b8330d-375e-49a4-948b-0aaad227e09e" containerName="pruner" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.802008 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2110837-0c54-448f-8b94-68bdea470d14" containerName="controller-manager" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.802451 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.810469 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.810685 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.810670 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"] Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.810879 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.812362 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.816544 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-client-ca\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.816602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6d27b9b-6d87-4aa8-abee-5d0323e96304-serving-cert\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc 
kubenswrapper[4778]: I0318 09:07:08.816632 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv4jr\" (UniqueName: \"kubernetes.io/projected/a4345a66-5037-444e-a1e8-c16f21fbdaca-kube-api-access-fv4jr\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.816653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4345a66-5037-444e-a1e8-c16f21fbdaca-serving-cert\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.816692 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-config\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.816718 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rrfs\" (UniqueName: \"kubernetes.io/projected/b6d27b9b-6d87-4aa8-abee-5d0323e96304-kube-api-access-6rrfs\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.817653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-config\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.817699 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-client-ca\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.817775 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-proxy-ca-bundles\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.818266 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.818434 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.818435 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.818458 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.821861 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.822081 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.822327 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.822566 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.823561 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.824366 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.825817 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"] Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.828596 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"] Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.856630 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29563744-btdt7" podUID="54961f10-93b0-433f-8a7d-b30d69178e9a" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.918975 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-proxy-ca-bundles\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.919395 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-client-ca\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.919535 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6d27b9b-6d87-4aa8-abee-5d0323e96304-serving-cert\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.919650 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv4jr\" (UniqueName: \"kubernetes.io/projected/a4345a66-5037-444e-a1e8-c16f21fbdaca-kube-api-access-fv4jr\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.919738 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4345a66-5037-444e-a1e8-c16f21fbdaca-serving-cert\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc 
kubenswrapper[4778]: I0318 09:07:08.919857 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-config\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.919942 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rrfs\" (UniqueName: \"kubernetes.io/projected/b6d27b9b-6d87-4aa8-abee-5d0323e96304-kube-api-access-6rrfs\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.920031 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-config\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.920122 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-client-ca\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.921176 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-client-ca\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: 
\"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.921595 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-config\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.922577 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-config\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.922580 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-client-ca\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.929718 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-proxy-ca-bundles\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.930250 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a4345a66-5037-444e-a1e8-c16f21fbdaca-serving-cert\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.934701 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6d27b9b-6d87-4aa8-abee-5d0323e96304-serving-cert\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.935826 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rrfs\" (UniqueName: \"kubernetes.io/projected/b6d27b9b-6d87-4aa8-abee-5d0323e96304-kube-api-access-6rrfs\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.936291 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv4jr\" (UniqueName: \"kubernetes.io/projected/a4345a66-5037-444e-a1e8-c16f21fbdaca-kube-api-access-fv4jr\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:09 crc kubenswrapper[4778]: I0318 09:07:09.130108 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:09 crc kubenswrapper[4778]: I0318 09:07:09.144219 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:10 crc kubenswrapper[4778]: I0318 09:07:10.517283 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.633988 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"] Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.725836 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"] Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.851279 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.852639 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.856091 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.856279 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.860534 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.937792 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c5110c4-3fed-4837-b17c-6578b2034f13-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.937838 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c5110c4-3fed-4837-b17c-6578b2034f13-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:16 crc kubenswrapper[4778]: I0318 09:07:16.039219 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c5110c4-3fed-4837-b17c-6578b2034f13-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:16 crc kubenswrapper[4778]: I0318 09:07:16.039276 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4c5110c4-3fed-4837-b17c-6578b2034f13-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:16 crc kubenswrapper[4778]: I0318 09:07:16.039510 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c5110c4-3fed-4837-b17c-6578b2034f13-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:16 crc kubenswrapper[4778]: I0318 09:07:16.066254 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c5110c4-3fed-4837-b17c-6578b2034f13-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:16 crc kubenswrapper[4778]: I0318 09:07:16.171407 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:17 crc kubenswrapper[4778]: E0318 09:07:17.228943 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 09:07:17 crc kubenswrapper[4778]: E0318 09:07:17.229127 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fp6w4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-6kvnk_openshift-marketplace(938982a6-57b0-4870-abed-a98c42196ae6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 09:07:17 crc kubenswrapper[4778]: E0318 09:07:17.231334 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-6kvnk" podUID="938982a6-57b0-4870-abed-a98c42196ae6" Mar 18 09:07:18 crc kubenswrapper[4778]: E0318 09:07:18.944649 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-6kvnk" podUID="938982a6-57b0-4870-abed-a98c42196ae6" Mar 18 09:07:18 crc kubenswrapper[4778]: I0318 09:07:18.952282 4778 scope.go:117] "RemoveContainer" containerID="c52ef847baaab8c966dfa357e5150a1446e9ecf64dd0cf070d104b1c9769de57" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.045908 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.046103 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gnfhp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rscg9_openshift-marketplace(3aedbf59-d23d-409e-9742-09824ed6ef2a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.047426 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rscg9" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" Mar 18 09:07:19 crc 
kubenswrapper[4778]: E0318 09:07:19.085620 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.085775 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qf8tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-6qgm2_openshift-marketplace(57f9c6f6-c20e-4e28-aec4-f0104ddb2b47): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.087472 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6qgm2" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.160014 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.160255 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlv42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-t74gc_openshift-marketplace(01828fdf-ef1b-44e3-905b-aec0c6aaa44f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.162370 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-t74gc" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" Mar 18 09:07:19 crc 
kubenswrapper[4778]: I0318 09:07:19.550180 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"] Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.609449 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 09:07:19 crc kubenswrapper[4778]: W0318 09:07:19.617264 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4c5110c4_3fed_4837_b17c_6578b2034f13.slice/crio-e4ffa2d08ea1bfd2b7fb4d426ddda63210ee2a45b84bf36efb4b50a85751c1c4 WatchSource:0}: Error finding container e4ffa2d08ea1bfd2b7fb4d426ddda63210ee2a45b84bf36efb4b50a85751c1c4: Status 404 returned error can't find the container with id e4ffa2d08ea1bfd2b7fb4d426ddda63210ee2a45b84bf36efb4b50a85751c1c4 Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.626553 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"] Mar 18 09:07:19 crc kubenswrapper[4778]: W0318 09:07:19.630628 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4345a66_5037_444e_a1e8_c16f21fbdaca.slice/crio-7757ee38844688929de1c12404b0d4e708b8a04f3d6ee943e8d293f97184928f WatchSource:0}: Error finding container 7757ee38844688929de1c12404b0d4e708b8a04f3d6ee943e8d293f97184928f: Status 404 returned error can't find the container with id 7757ee38844688929de1c12404b0d4e708b8a04f3d6ee943e8d293f97184928f Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.919886 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerStarted","Data":"db7b16ab120c184db5ebadc2caf608fa9242b9a332050072ad2cae2fba3722b7"} Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.922051 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerStarted","Data":"8b9319e52264a71946e18a681c32dbd6ffc04e6afcc03a59b8bfa719c7422b7f"} Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.923337 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4c5110c4-3fed-4837-b17c-6578b2034f13","Type":"ContainerStarted","Data":"e4ffa2d08ea1bfd2b7fb4d426ddda63210ee2a45b84bf36efb4b50a85751c1c4"} Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.927879 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerStarted","Data":"206c2187bf0a136c6ff49b69bb1bb6cc918dc7e2a3d9bd6a2d8bac6ce3a51e5f"} Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.929209 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" event={"ID":"a4345a66-5037-444e-a1e8-c16f21fbdaca","Type":"ContainerStarted","Data":"7757ee38844688929de1c12404b0d4e708b8a04f3d6ee943e8d293f97184928f"} Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.931911 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerStarted","Data":"080f512015bd3cc96010f06c24c5ff7c172a932d2f9c91b8b4e05d0e6fdb8776"} Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.932999 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" event={"ID":"b6d27b9b-6d87-4aa8-abee-5d0323e96304","Type":"ContainerStarted","Data":"a122f5ee8353a3e56d947f8d425d42ac2f3e6f348d5fc80375524cfbc8e649c9"} Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.934397 4778 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6qgm2" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.934730 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-t74gc" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.935055 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rscg9" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.942578 4778 generic.go:334] "Generic (PLEG): container finished" podID="4c5110c4-3fed-4837-b17c-6578b2034f13" containerID="59e97b61a0f05736ea90d2008ef8d588313cdc9a879cf789dff3d41032af56db" exitCode=0 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.942677 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4c5110c4-3fed-4837-b17c-6578b2034f13","Type":"ContainerDied","Data":"59e97b61a0f05736ea90d2008ef8d588313cdc9a879cf789dff3d41032af56db"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.946546 4778 generic.go:334] "Generic (PLEG): container finished" podID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerID="080f512015bd3cc96010f06c24c5ff7c172a932d2f9c91b8b4e05d0e6fdb8776" exitCode=0 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.946605 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerDied","Data":"080f512015bd3cc96010f06c24c5ff7c172a932d2f9c91b8b4e05d0e6fdb8776"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.948480 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" event={"ID":"b6d27b9b-6d87-4aa8-abee-5d0323e96304","Type":"ContainerStarted","Data":"9dc0270e3833a9e7f231ec92032b5254e8de6e37228c467052e88b73ed8dd1ef"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.948752 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" podUID="b6d27b9b-6d87-4aa8-abee-5d0323e96304" containerName="controller-manager" containerID="cri-o://9dc0270e3833a9e7f231ec92032b5254e8de6e37228c467052e88b73ed8dd1ef" gracePeriod=30 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.948953 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.951037 4778 generic.go:334] "Generic (PLEG): container finished" podID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerID="206c2187bf0a136c6ff49b69bb1bb6cc918dc7e2a3d9bd6a2d8bac6ce3a51e5f" exitCode=0 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.951087 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerDied","Data":"206c2187bf0a136c6ff49b69bb1bb6cc918dc7e2a3d9bd6a2d8bac6ce3a51e5f"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.954477 4778 generic.go:334] "Generic (PLEG): container finished" podID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerID="db7b16ab120c184db5ebadc2caf608fa9242b9a332050072ad2cae2fba3722b7" 
exitCode=0 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.954544 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerDied","Data":"db7b16ab120c184db5ebadc2caf608fa9242b9a332050072ad2cae2fba3722b7"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.957462 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.960059 4778 generic.go:334] "Generic (PLEG): container finished" podID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerID="8b9319e52264a71946e18a681c32dbd6ffc04e6afcc03a59b8bfa719c7422b7f" exitCode=0 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.960142 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerDied","Data":"8b9319e52264a71946e18a681c32dbd6ffc04e6afcc03a59b8bfa719c7422b7f"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.966495 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" event={"ID":"a4345a66-5037-444e-a1e8-c16f21fbdaca","Type":"ContainerStarted","Data":"80caeba679f5d418af7acb193febae34c30e2113a5b3eda92aafc367f3b5c972"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.966677 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" podUID="a4345a66-5037-444e-a1e8-c16f21fbdaca" containerName="route-controller-manager" containerID="cri-o://80caeba679f5d418af7acb193febae34c30e2113a5b3eda92aafc367f3b5c972" gracePeriod=30 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.966923 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.975232 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:21 crc kubenswrapper[4778]: I0318 09:07:21.070957 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" podStartSLOduration=26.070942226 podStartE2EDuration="26.070942226s" podCreationTimestamp="2026-03-18 09:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:21.067958391 +0000 UTC m=+307.642703231" watchObservedRunningTime="2026-03-18 09:07:21.070942226 +0000 UTC m=+307.645687066" Mar 18 09:07:21 crc kubenswrapper[4778]: I0318 09:07:21.088821 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" podStartSLOduration=26.088804542 podStartE2EDuration="26.088804542s" podCreationTimestamp="2026-03-18 09:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:21.086086635 +0000 UTC m=+307.660831485" watchObservedRunningTime="2026-03-18 09:07:21.088804542 +0000 UTC m=+307.663549382" Mar 18 09:07:21 crc kubenswrapper[4778]: I0318 09:07:21.974415 4778 generic.go:334] "Generic (PLEG): container finished" podID="b6d27b9b-6d87-4aa8-abee-5d0323e96304" containerID="9dc0270e3833a9e7f231ec92032b5254e8de6e37228c467052e88b73ed8dd1ef" exitCode=0 Mar 18 09:07:21 crc kubenswrapper[4778]: I0318 09:07:21.974527 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" 
event={"ID":"b6d27b9b-6d87-4aa8-abee-5d0323e96304","Type":"ContainerDied","Data":"9dc0270e3833a9e7f231ec92032b5254e8de6e37228c467052e88b73ed8dd1ef"} Mar 18 09:07:21 crc kubenswrapper[4778]: I0318 09:07:21.976905 4778 generic.go:334] "Generic (PLEG): container finished" podID="a4345a66-5037-444e-a1e8-c16f21fbdaca" containerID="80caeba679f5d418af7acb193febae34c30e2113a5b3eda92aafc367f3b5c972" exitCode=0 Mar 18 09:07:21 crc kubenswrapper[4778]: I0318 09:07:21.977021 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" event={"ID":"a4345a66-5037-444e-a1e8-c16f21fbdaca","Type":"ContainerDied","Data":"80caeba679f5d418af7acb193febae34c30e2113a5b3eda92aafc367f3b5c972"} Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.240789 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.333928 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c5110c4-3fed-4837-b17c-6578b2034f13-kubelet-dir\") pod \"4c5110c4-3fed-4837-b17c-6578b2034f13\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.334448 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c5110c4-3fed-4837-b17c-6578b2034f13-kube-api-access\") pod \"4c5110c4-3fed-4837-b17c-6578b2034f13\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.334048 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c5110c4-3fed-4837-b17c-6578b2034f13-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4c5110c4-3fed-4837-b17c-6578b2034f13" (UID: "4c5110c4-3fed-4837-b17c-6578b2034f13"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.334765 4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c5110c4-3fed-4837-b17c-6578b2034f13-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.342282 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5110c4-3fed-4837-b17c-6578b2034f13-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4c5110c4-3fed-4837-b17c-6578b2034f13" (UID: "4c5110c4-3fed-4837-b17c-6578b2034f13"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.435850 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c5110c4-3fed-4837-b17c-6578b2034f13-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.484147 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.537248 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-config\") pod \"a4345a66-5037-444e-a1e8-c16f21fbdaca\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.537369 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-client-ca\") pod \"a4345a66-5037-444e-a1e8-c16f21fbdaca\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.537417 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4345a66-5037-444e-a1e8-c16f21fbdaca-serving-cert\") pod \"a4345a66-5037-444e-a1e8-c16f21fbdaca\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.537485 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv4jr\" (UniqueName: \"kubernetes.io/projected/a4345a66-5037-444e-a1e8-c16f21fbdaca-kube-api-access-fv4jr\") pod \"a4345a66-5037-444e-a1e8-c16f21fbdaca\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.539884 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-client-ca" (OuterVolumeSpecName: "client-ca") pod "a4345a66-5037-444e-a1e8-c16f21fbdaca" (UID: "a4345a66-5037-444e-a1e8-c16f21fbdaca"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.539921 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-config" (OuterVolumeSpecName: "config") pod "a4345a66-5037-444e-a1e8-c16f21fbdaca" (UID: "a4345a66-5037-444e-a1e8-c16f21fbdaca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.542557 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4345a66-5037-444e-a1e8-c16f21fbdaca-kube-api-access-fv4jr" (OuterVolumeSpecName: "kube-api-access-fv4jr") pod "a4345a66-5037-444e-a1e8-c16f21fbdaca" (UID: "a4345a66-5037-444e-a1e8-c16f21fbdaca"). InnerVolumeSpecName "kube-api-access-fv4jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.548535 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4345a66-5037-444e-a1e8-c16f21fbdaca-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a4345a66-5037-444e-a1e8-c16f21fbdaca" (UID: "a4345a66-5037-444e-a1e8-c16f21fbdaca"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.578276 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.638847 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-client-ca\") pod \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.638977 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rrfs\" (UniqueName: \"kubernetes.io/projected/b6d27b9b-6d87-4aa8-abee-5d0323e96304-kube-api-access-6rrfs\") pod \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639018 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-config\") pod \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639091 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6d27b9b-6d87-4aa8-abee-5d0323e96304-serving-cert\") pod \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639138 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-proxy-ca-bundles\") pod \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639469 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv4jr\" 
(UniqueName: \"kubernetes.io/projected/a4345a66-5037-444e-a1e8-c16f21fbdaca-kube-api-access-fv4jr\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639493 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639503 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639512 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4345a66-5037-444e-a1e8-c16f21fbdaca-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.640615 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b6d27b9b-6d87-4aa8-abee-5d0323e96304" (UID: "b6d27b9b-6d87-4aa8-abee-5d0323e96304"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.641037 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-client-ca" (OuterVolumeSpecName: "client-ca") pod "b6d27b9b-6d87-4aa8-abee-5d0323e96304" (UID: "b6d27b9b-6d87-4aa8-abee-5d0323e96304"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.642450 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-config" (OuterVolumeSpecName: "config") pod "b6d27b9b-6d87-4aa8-abee-5d0323e96304" (UID: "b6d27b9b-6d87-4aa8-abee-5d0323e96304"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.646335 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6d27b9b-6d87-4aa8-abee-5d0323e96304-kube-api-access-6rrfs" (OuterVolumeSpecName: "kube-api-access-6rrfs") pod "b6d27b9b-6d87-4aa8-abee-5d0323e96304" (UID: "b6d27b9b-6d87-4aa8-abee-5d0323e96304"). InnerVolumeSpecName "kube-api-access-6rrfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.661495 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6d27b9b-6d87-4aa8-abee-5d0323e96304-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b6d27b9b-6d87-4aa8-abee-5d0323e96304" (UID: "b6d27b9b-6d87-4aa8-abee-5d0323e96304"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.740819 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6d27b9b-6d87-4aa8-abee-5d0323e96304-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.740908 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.740921 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.740930 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rrfs\" (UniqueName: \"kubernetes.io/projected/b6d27b9b-6d87-4aa8-abee-5d0323e96304-kube-api-access-6rrfs\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.740965 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.988781 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerStarted","Data":"6d51e3ef30717618dbfd60d2700ddf309051ceaf03fe8bc0561397808fdc4760"} Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.993643 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563744-btdt7" 
event={"ID":"54961f10-93b0-433f-8a7d-b30d69178e9a","Type":"ContainerStarted","Data":"2aaa498f349b23cf7a4f0fb9da41ba553f76ed88636548c40f4f1cf1a8220b22"} Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.996798 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" event={"ID":"b6d27b9b-6d87-4aa8-abee-5d0323e96304","Type":"ContainerDied","Data":"a122f5ee8353a3e56d947f8d425d42ac2f3e6f348d5fc80375524cfbc8e649c9"} Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.996827 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.996870 4778 scope.go:117] "RemoveContainer" containerID="9dc0270e3833a9e7f231ec92032b5254e8de6e37228c467052e88b73ed8dd1ef" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.002353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerStarted","Data":"afc4a86783668e2ef48e83935f1494b68e42a6641cb083187f72530915bb720f"} Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.007499 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563746-b66f7" event={"ID":"c3be356e-94af-47db-a182-dd8a57024619","Type":"ContainerStarted","Data":"cf4a9ddbe48af9c3f976ba168fa13253c79814734a6e5e0e3ef5fa348e79df80"} Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.025501 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerStarted","Data":"de8929b794bb5d2a7228965e1493c7e14a2590362f85b344f68eb15fc61bd4bf"} Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.031591 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" event={"ID":"a4345a66-5037-444e-a1e8-c16f21fbdaca","Type":"ContainerDied","Data":"7757ee38844688929de1c12404b0d4e708b8a04f3d6ee943e8d293f97184928f"} Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.031734 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.035643 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4c5110c4-3fed-4837-b17c-6578b2034f13","Type":"ContainerDied","Data":"e4ffa2d08ea1bfd2b7fb4d426ddda63210ee2a45b84bf36efb4b50a85751c1c4"} Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.035671 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4ffa2d08ea1bfd2b7fb4d426ddda63210ee2a45b84bf36efb4b50a85751c1c4" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.035716 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.037183 4778 scope.go:117] "RemoveContainer" containerID="80caeba679f5d418af7acb193febae34c30e2113a5b3eda92aafc367f3b5c972" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.041517 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tbbtb" podStartSLOduration=3.451476477 podStartE2EDuration="47.041489973s" podCreationTimestamp="2026-03-18 09:06:36 +0000 UTC" firstStartedPulling="2026-03-18 09:06:39.112018937 +0000 UTC m=+265.686763767" lastFinishedPulling="2026-03-18 09:07:22.702032423 +0000 UTC m=+309.276777263" observedRunningTime="2026-03-18 09:07:23.019634734 +0000 UTC m=+309.594379574" watchObservedRunningTime="2026-03-18 09:07:23.041489973 +0000 UTC m=+309.616234843" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.042429 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563744-btdt7" podStartSLOduration=152.665405137 podStartE2EDuration="3m23.042419119s" podCreationTimestamp="2026-03-18 09:04:00 +0000 UTC" firstStartedPulling="2026-03-18 09:06:32.037264405 +0000 UTC m=+258.612009245" lastFinishedPulling="2026-03-18 09:07:22.414278387 +0000 UTC m=+308.989023227" observedRunningTime="2026-03-18 09:07:23.037476139 +0000 UTC m=+309.612220999" watchObservedRunningTime="2026-03-18 09:07:23.042419119 +0000 UTC m=+309.617163959" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.061378 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563746-b66f7" podStartSLOduration=32.830250131 podStartE2EDuration="1m23.061358116s" podCreationTimestamp="2026-03-18 09:06:00 +0000 UTC" firstStartedPulling="2026-03-18 09:06:32.036186506 +0000 UTC m=+258.610931346" lastFinishedPulling="2026-03-18 09:07:22.267294491 +0000 UTC m=+308.842039331" 
observedRunningTime="2026-03-18 09:07:23.053358589 +0000 UTC m=+309.628103419" watchObservedRunningTime="2026-03-18 09:07:23.061358116 +0000 UTC m=+309.636102956" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.075696 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lzrtd" podStartSLOduration=2.429990659 podStartE2EDuration="46.075673392s" podCreationTimestamp="2026-03-18 09:06:37 +0000 UTC" firstStartedPulling="2026-03-18 09:06:39.144840117 +0000 UTC m=+265.719584967" lastFinishedPulling="2026-03-18 09:07:22.79052286 +0000 UTC m=+309.365267700" observedRunningTime="2026-03-18 09:07:23.075055225 +0000 UTC m=+309.649800065" watchObservedRunningTime="2026-03-18 09:07:23.075673392 +0000 UTC m=+309.650418232" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.093410 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"] Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.096942 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"] Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.167257 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zwknx" podStartSLOduration=2.6196814760000002 podStartE2EDuration="46.167228067s" podCreationTimestamp="2026-03-18 09:06:37 +0000 UTC" firstStartedPulling="2026-03-18 09:06:39.153072221 +0000 UTC m=+265.727817071" lastFinishedPulling="2026-03-18 09:07:22.700618822 +0000 UTC m=+309.275363662" observedRunningTime="2026-03-18 09:07:23.15749652 +0000 UTC m=+309.732241380" watchObservedRunningTime="2026-03-18 09:07:23.167228067 +0000 UTC m=+309.741972917" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.172101 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"] Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.178390 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"] Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.221689 4778 csr.go:261] certificate signing request csr-zckbj is approved, waiting to be issued Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.229628 4778 csr.go:257] certificate signing request csr-zckbj is issued Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.047923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerStarted","Data":"70cbb8df67b66047dc2936fa97099c75389be5914cb70774619b80a3c1ca3b41"} Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.050003 4778 generic.go:334] "Generic (PLEG): container finished" podID="c3be356e-94af-47db-a182-dd8a57024619" containerID="cf4a9ddbe48af9c3f976ba168fa13253c79814734a6e5e0e3ef5fa348e79df80" exitCode=0 Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.050122 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563746-b66f7" event={"ID":"c3be356e-94af-47db-a182-dd8a57024619","Type":"ContainerDied","Data":"cf4a9ddbe48af9c3f976ba168fa13253c79814734a6e5e0e3ef5fa348e79df80"} Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.053516 4778 generic.go:334] "Generic (PLEG): container finished" podID="54961f10-93b0-433f-8a7d-b30d69178e9a" containerID="2aaa498f349b23cf7a4f0fb9da41ba553f76ed88636548c40f4f1cf1a8220b22" exitCode=0 Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.053584 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563744-btdt7" 
event={"ID":"54961f10-93b0-433f-8a7d-b30d69178e9a","Type":"ContainerDied","Data":"2aaa498f349b23cf7a4f0fb9da41ba553f76ed88636548c40f4f1cf1a8220b22"} Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.068805 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qvn4w" podStartSLOduration=3.253544468 podStartE2EDuration="47.068787667s" podCreationTimestamp="2026-03-18 09:06:37 +0000 UTC" firstStartedPulling="2026-03-18 09:06:39.123403949 +0000 UTC m=+265.698148789" lastFinishedPulling="2026-03-18 09:07:22.938647138 +0000 UTC m=+309.513391988" observedRunningTime="2026-03-18 09:07:24.067943333 +0000 UTC m=+310.642688173" watchObservedRunningTime="2026-03-18 09:07:24.068787667 +0000 UTC m=+310.643532507" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.194347 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4345a66-5037-444e-a1e8-c16f21fbdaca" path="/var/lib/kubelet/pods/a4345a66-5037-444e-a1e8-c16f21fbdaca/volumes" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.195010 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6d27b9b-6d87-4aa8-abee-5d0323e96304" path="/var/lib/kubelet/pods/b6d27b9b-6d87-4aa8-abee-5d0323e96304/volumes" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.231995 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-11 15:03:05.394962459 +0000 UTC Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.232090 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7181h55m41.162875563s for next certificate rotation Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448391 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 09:07:24 crc kubenswrapper[4778]: E0318 09:07:24.448653 4778 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4c5110c4-3fed-4837-b17c-6578b2034f13" containerName="pruner" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448668 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5110c4-3fed-4837-b17c-6578b2034f13" containerName="pruner" Mar 18 09:07:24 crc kubenswrapper[4778]: E0318 09:07:24.448687 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d27b9b-6d87-4aa8-abee-5d0323e96304" containerName="controller-manager" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448696 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d27b9b-6d87-4aa8-abee-5d0323e96304" containerName="controller-manager" Mar 18 09:07:24 crc kubenswrapper[4778]: E0318 09:07:24.448716 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4345a66-5037-444e-a1e8-c16f21fbdaca" containerName="route-controller-manager" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448724 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4345a66-5037-444e-a1e8-c16f21fbdaca" containerName="route-controller-manager" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448850 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c5110c4-3fed-4837-b17c-6578b2034f13" containerName="pruner" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448862 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d27b9b-6d87-4aa8-abee-5d0323e96304" containerName="controller-manager" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448870 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4345a66-5037-444e-a1e8-c16f21fbdaca" containerName="route-controller-manager" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.449307 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.451844 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.454385 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.464402 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.470171 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.470508 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-var-lock\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.470589 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93723e0d-2243-4390-a667-8a080325205f-kube-api-access\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.571370 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-var-lock\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.571436 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93723e0d-2243-4390-a667-8a080325205f-kube-api-access\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.571487 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.571570 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.571623 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-var-lock\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.598980 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93723e0d-2243-4390-a667-8a080325205f-kube-api-access\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.764110 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.811761 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24"] Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.812655 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.820159 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-785666b9f5-xt96c"] Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.822011 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.834564 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.837143 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.837613 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.837805 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.840969 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 
09:07:24.841208 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.841359 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.841713 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24"] Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.843548 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.843611 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.844186 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.844399 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.852277 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.862257 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-785666b9f5-xt96c"] Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.862776 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.980307 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-client-ca\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.980874 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/180e4f84-52ed-4db3-aec6-c724becfadf1-serving-cert\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.980949 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-config\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.981076 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txfl6\" (UniqueName: \"kubernetes.io/projected/2cd169c1-e595-4497-856c-3dd27c1cf551-kube-api-access-txfl6\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.981105 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-proxy-ca-bundles\") pod 
\"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.981258 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd169c1-e595-4497-856c-3dd27c1cf551-serving-cert\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.981329 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-client-ca\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.981360 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-config\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.982314 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8mw9\" (UniqueName: \"kubernetes.io/projected/180e4f84-52ed-4db3-aec6-c724becfadf1-kube-api-access-c8mw9\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 
09:07:25.085090 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txfl6\" (UniqueName: \"kubernetes.io/projected/2cd169c1-e595-4497-856c-3dd27c1cf551-kube-api-access-txfl6\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085145 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-proxy-ca-bundles\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd169c1-e595-4497-856c-3dd27c1cf551-serving-cert\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085244 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-client-ca\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085302 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-config\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " 
pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085328 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8mw9\" (UniqueName: \"kubernetes.io/projected/180e4f84-52ed-4db3-aec6-c724becfadf1-kube-api-access-c8mw9\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085376 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-client-ca\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085444 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-config\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085460 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/180e4f84-52ed-4db3-aec6-c724becfadf1-serving-cert\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.086746 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-config\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.087956 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-proxy-ca-bundles\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.090032 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/180e4f84-52ed-4db3-aec6-c724becfadf1-serving-cert\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.091834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-client-ca\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.093351 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-config\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.094228 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-client-ca\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.095028 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd169c1-e595-4497-856c-3dd27c1cf551-serving-cert\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.107482 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txfl6\" (UniqueName: \"kubernetes.io/projected/2cd169c1-e595-4497-856c-3dd27c1cf551-kube-api-access-txfl6\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.110903 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8mw9\" (UniqueName: \"kubernetes.io/projected/180e4f84-52ed-4db3-aec6-c724becfadf1-kube-api-access-c8mw9\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.173483 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.182152 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.237351 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-20 07:23:31.301105393 +0000 UTC Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.237408 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7390h16m6.063700246s for next certificate rotation Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.309075 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.353740 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.441243 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.491418 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjwps\" (UniqueName: \"kubernetes.io/projected/54961f10-93b0-433f-8a7d-b30d69178e9a-kube-api-access-zjwps\") pod \"54961f10-93b0-433f-8a7d-b30d69178e9a\" (UID: \"54961f10-93b0-433f-8a7d-b30d69178e9a\") " Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.491537 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55w8z\" (UniqueName: \"kubernetes.io/projected/c3be356e-94af-47db-a182-dd8a57024619-kube-api-access-55w8z\") pod \"c3be356e-94af-47db-a182-dd8a57024619\" (UID: \"c3be356e-94af-47db-a182-dd8a57024619\") " Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.503798 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3be356e-94af-47db-a182-dd8a57024619-kube-api-access-55w8z" (OuterVolumeSpecName: "kube-api-access-55w8z") pod "c3be356e-94af-47db-a182-dd8a57024619" (UID: "c3be356e-94af-47db-a182-dd8a57024619"). InnerVolumeSpecName "kube-api-access-55w8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.507627 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54961f10-93b0-433f-8a7d-b30d69178e9a-kube-api-access-zjwps" (OuterVolumeSpecName: "kube-api-access-zjwps") pod "54961f10-93b0-433f-8a7d-b30d69178e9a" (UID: "54961f10-93b0-433f-8a7d-b30d69178e9a"). InnerVolumeSpecName "kube-api-access-zjwps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.592373 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjwps\" (UniqueName: \"kubernetes.io/projected/54961f10-93b0-433f-8a7d-b30d69178e9a-kube-api-access-zjwps\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.592412 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55w8z\" (UniqueName: \"kubernetes.io/projected/c3be356e-94af-47db-a182-dd8a57024619-kube-api-access-55w8z\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.595159 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-785666b9f5-xt96c"] Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.735319 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24"] Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.073041 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.073030 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563746-b66f7" event={"ID":"c3be356e-94af-47db-a182-dd8a57024619","Type":"ContainerDied","Data":"4d957b42c20ebb120c0681574b93e0b852f5977f6c96c78d95883a927b1e8844"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.073229 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d957b42c20ebb120c0681574b93e0b852f5977f6c96c78d95883a927b1e8844" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.085261 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"93723e0d-2243-4390-a667-8a080325205f","Type":"ContainerStarted","Data":"e2b50472335d82d0cbf47f1b30666ae7b50519462f89aadf15ef33cf94804236"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.085300 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"93723e0d-2243-4390-a667-8a080325205f","Type":"ContainerStarted","Data":"e6f7bb7ae014c0c468ab5d5f0830ec452ff1c1962710492d6af9f507d40acc53"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.089550 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563744-btdt7" event={"ID":"54961f10-93b0-433f-8a7d-b30d69178e9a","Type":"ContainerDied","Data":"df77c4671fb6dc8dc3716ac3d7733190f2f2696ab30319657174e00cec76ec77"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.089624 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df77c4671fb6dc8dc3716ac3d7733190f2f2696ab30319657174e00cec76ec77" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.089570 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.090967 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" event={"ID":"180e4f84-52ed-4db3-aec6-c724becfadf1","Type":"ContainerStarted","Data":"6182c0f78e6f784c954f3d716e1ee189ef539115eb9b295b92f5ba494516c3c6"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.091042 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" event={"ID":"180e4f84-52ed-4db3-aec6-c724becfadf1","Type":"ContainerStarted","Data":"9451f090d87a54fcdbbfb38d7aba157bfa9ee730f5dff506b43c409b10178b80"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.091744 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.093035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" event={"ID":"2cd169c1-e595-4497-856c-3dd27c1cf551","Type":"ContainerStarted","Data":"218eebc0b353408962cb4ccc7a6dbcb7e94147122bce8e36e5a54e046e5b06f8"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.093063 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" event={"ID":"2cd169c1-e595-4497-856c-3dd27c1cf551","Type":"ContainerStarted","Data":"fd1c635644f48a9fb2e1288c201f66a5f81d58f274081aff739ae76932d747eb"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.093637 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.112425 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.121528 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.121513343 podStartE2EDuration="2.121513343s" podCreationTimestamp="2026-03-18 09:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:26.118095557 +0000 UTC m=+312.692840427" watchObservedRunningTime="2026-03-18 09:07:26.121513343 +0000 UTC m=+312.696258183" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.161250 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" podStartSLOduration=11.161220468 podStartE2EDuration="11.161220468s" podCreationTimestamp="2026-03-18 09:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:26.158430519 +0000 UTC m=+312.733175379" watchObservedRunningTime="2026-03-18 09:07:26.161220468 +0000 UTC m=+312.735965318" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.184562 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" podStartSLOduration=11.18454217 podStartE2EDuration="11.18454217s" podCreationTimestamp="2026-03-18 09:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:26.179987211 +0000 UTC m=+312.754732061" watchObservedRunningTime="2026-03-18 09:07:26.18454217 +0000 UTC m=+312.759287010" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.280387 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.309356 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.309413 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.541700 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.542794 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.675281 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.675427 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.707900 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.708392 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.722526 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.117672 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.117716 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.163379 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.183717 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.185544 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.202704 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.394652 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5dpv"] Mar 18 09:07:29 crc kubenswrapper[4778]: I0318 09:07:29.157839 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.147617 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.147669 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.147717 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.148242 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.148313 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc" gracePeriod=600 Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.161972 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zwknx"] Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.367600 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzrtd"] Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.367935 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lzrtd" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="registry-server" containerID="cri-o://afc4a86783668e2ef48e83935f1494b68e42a6641cb083187f72530915bb720f" gracePeriod=2 Mar 18 09:07:31 crc kubenswrapper[4778]: I0318 09:07:31.128293 4778 generic.go:334] 
"Generic (PLEG): container finished" podID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerID="afc4a86783668e2ef48e83935f1494b68e42a6641cb083187f72530915bb720f" exitCode=0 Mar 18 09:07:31 crc kubenswrapper[4778]: I0318 09:07:31.128388 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerDied","Data":"afc4a86783668e2ef48e83935f1494b68e42a6641cb083187f72530915bb720f"} Mar 18 09:07:31 crc kubenswrapper[4778]: I0318 09:07:31.131048 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc" exitCode=0 Mar 18 09:07:31 crc kubenswrapper[4778]: I0318 09:07:31.131421 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zwknx" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="registry-server" containerID="cri-o://de8929b794bb5d2a7228965e1493c7e14a2590362f85b344f68eb15fc61bd4bf" gracePeriod=2 Mar 18 09:07:31 crc kubenswrapper[4778]: I0318 09:07:31.131185 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc"} Mar 18 09:07:31 crc kubenswrapper[4778]: I0318 09:07:31.941315 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.018355 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfhg5\" (UniqueName: \"kubernetes.io/projected/3618fc0f-e8b2-4476-a24d-662165a04ecc-kube-api-access-nfhg5\") pod \"3618fc0f-e8b2-4476-a24d-662165a04ecc\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.018738 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-utilities\") pod \"3618fc0f-e8b2-4476-a24d-662165a04ecc\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.018821 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-catalog-content\") pod \"3618fc0f-e8b2-4476-a24d-662165a04ecc\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.019692 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-utilities" (OuterVolumeSpecName: "utilities") pod "3618fc0f-e8b2-4476-a24d-662165a04ecc" (UID: "3618fc0f-e8b2-4476-a24d-662165a04ecc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.024544 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3618fc0f-e8b2-4476-a24d-662165a04ecc-kube-api-access-nfhg5" (OuterVolumeSpecName: "kube-api-access-nfhg5") pod "3618fc0f-e8b2-4476-a24d-662165a04ecc" (UID: "3618fc0f-e8b2-4476-a24d-662165a04ecc"). InnerVolumeSpecName "kube-api-access-nfhg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.093082 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3618fc0f-e8b2-4476-a24d-662165a04ecc" (UID: "3618fc0f-e8b2-4476-a24d-662165a04ecc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.121054 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfhg5\" (UniqueName: \"kubernetes.io/projected/3618fc0f-e8b2-4476-a24d-662165a04ecc-kube-api-access-nfhg5\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.121098 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.121112 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.142307 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.142292 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerDied","Data":"f49ddb65c41adfb45d65de1198b0f82574583119a6d4929d1ffa55ce9f770dcc"} Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.142473 4778 scope.go:117] "RemoveContainer" containerID="afc4a86783668e2ef48e83935f1494b68e42a6641cb083187f72530915bb720f" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.145128 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"34445970a34bfac9ea81483ad63bd182faceb7cf5f4517903b0d318af28a7e07"} Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.150141 4778 generic.go:334] "Generic (PLEG): container finished" podID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerID="de8929b794bb5d2a7228965e1493c7e14a2590362f85b344f68eb15fc61bd4bf" exitCode=0 Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.150207 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerDied","Data":"de8929b794bb5d2a7228965e1493c7e14a2590362f85b344f68eb15fc61bd4bf"} Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.150229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerDied","Data":"fd97a190415e4d1219ea6675fa83baba068c5459e6a2c0b39450f9d62ba9f267"} Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.150241 4778 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="fd97a190415e4d1219ea6675fa83baba068c5459e6a2c0b39450f9d62ba9f267" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.163972 4778 scope.go:117] "RemoveContainer" containerID="db7b16ab120c184db5ebadc2caf608fa9242b9a332050072ad2cae2fba3722b7" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.169840 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.207860 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzrtd"] Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.207904 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lzrtd"] Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.210792 4778 scope.go:117] "RemoveContainer" containerID="33f31c9dc67a1137fd25116299f097d08a1ad1b4a3924bc1eb5ff8d0db0c9727" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.323357 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mff6k\" (UniqueName: \"kubernetes.io/projected/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-kube-api-access-mff6k\") pod \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.323409 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-catalog-content\") pod \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.323437 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-utilities\") pod \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\" 
(UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.324524 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-utilities" (OuterVolumeSpecName: "utilities") pod "23b7607d-fa16-45c1-a0cb-c5ec39a288fb" (UID: "23b7607d-fa16-45c1-a0cb-c5ec39a288fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.326251 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-kube-api-access-mff6k" (OuterVolumeSpecName: "kube-api-access-mff6k") pod "23b7607d-fa16-45c1-a0cb-c5ec39a288fb" (UID: "23b7607d-fa16-45c1-a0cb-c5ec39a288fb"). InnerVolumeSpecName "kube-api-access-mff6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.374932 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23b7607d-fa16-45c1-a0cb-c5ec39a288fb" (UID: "23b7607d-fa16-45c1-a0cb-c5ec39a288fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.425043 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mff6k\" (UniqueName: \"kubernetes.io/projected/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-kube-api-access-mff6k\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.425103 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.425122 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:33 crc kubenswrapper[4778]: I0318 09:07:33.164781 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:07:33 crc kubenswrapper[4778]: I0318 09:07:33.209965 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zwknx"] Mar 18 09:07:33 crc kubenswrapper[4778]: I0318 09:07:33.214644 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zwknx"] Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.172963 4778 generic.go:334] "Generic (PLEG): container finished" podID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerID="dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea" exitCode=0 Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.173384 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rscg9" event={"ID":"3aedbf59-d23d-409e-9742-09824ed6ef2a","Type":"ContainerDied","Data":"dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea"} Mar 18 
09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.179276 4778 generic.go:334] "Generic (PLEG): container finished" podID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerID="f49cf5ea04db3604b7012853be48f57eabfbbf2919ff145d883ab1c07e04a460" exitCode=0 Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.179384 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qgm2" event={"ID":"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47","Type":"ContainerDied","Data":"f49cf5ea04db3604b7012853be48f57eabfbbf2919ff145d883ab1c07e04a460"} Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.183544 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerStarted","Data":"fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731"} Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.197566 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" path="/var/lib/kubelet/pods/23b7607d-fa16-45c1-a0cb-c5ec39a288fb/volumes" Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.199446 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" path="/var/lib/kubelet/pods/3618fc0f-e8b2-4476-a24d-662165a04ecc/volumes" Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.200323 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerStarted","Data":"d3a1ccac938944a3f089f8c92f9aebbf40c8042f08e26dece0839acbb161f095"} Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.217554 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rscg9" 
event={"ID":"3aedbf59-d23d-409e-9742-09824ed6ef2a","Type":"ContainerStarted","Data":"5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be"} Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.220917 4778 generic.go:334] "Generic (PLEG): container finished" podID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerID="fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731" exitCode=0 Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.221022 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerDied","Data":"fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731"} Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.225225 4778 generic.go:334] "Generic (PLEG): container finished" podID="938982a6-57b0-4870-abed-a98c42196ae6" containerID="d3a1ccac938944a3f089f8c92f9aebbf40c8042f08e26dece0839acbb161f095" exitCode=0 Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.225356 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerDied","Data":"d3a1ccac938944a3f089f8c92f9aebbf40c8042f08e26dece0839acbb161f095"} Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.283843 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rscg9" podStartSLOduration=3.024096878 podStartE2EDuration="56.28381566s" podCreationTimestamp="2026-03-18 09:06:39 +0000 UTC" firstStartedPulling="2026-03-18 09:06:41.358542745 +0000 UTC m=+267.933287585" lastFinishedPulling="2026-03-18 09:07:34.618261527 +0000 UTC m=+321.193006367" observedRunningTime="2026-03-18 09:07:35.251520034 +0000 UTC m=+321.826264894" watchObservedRunningTime="2026-03-18 09:07:35.28381566 +0000 UTC m=+321.858560540" Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.681288 4778 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-785666b9f5-xt96c"] Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.682020 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" podUID="2cd169c1-e595-4497-856c-3dd27c1cf551" containerName="controller-manager" containerID="cri-o://218eebc0b353408962cb4ccc7a6dbcb7e94147122bce8e36e5a54e046e5b06f8" gracePeriod=30 Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.718215 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24"] Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.718453 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" podUID="180e4f84-52ed-4db3-aec6-c724becfadf1" containerName="route-controller-manager" containerID="cri-o://6182c0f78e6f784c954f3d716e1ee189ef539115eb9b295b92f5ba494516c3c6" gracePeriod=30 Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.233650 4778 generic.go:334] "Generic (PLEG): container finished" podID="180e4f84-52ed-4db3-aec6-c724becfadf1" containerID="6182c0f78e6f784c954f3d716e1ee189ef539115eb9b295b92f5ba494516c3c6" exitCode=0 Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.233734 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" event={"ID":"180e4f84-52ed-4db3-aec6-c724becfadf1","Type":"ContainerDied","Data":"6182c0f78e6f784c954f3d716e1ee189ef539115eb9b295b92f5ba494516c3c6"} Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.235735 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" 
event={"ID":"180e4f84-52ed-4db3-aec6-c724becfadf1","Type":"ContainerDied","Data":"9451f090d87a54fcdbbfb38d7aba157bfa9ee730f5dff506b43c409b10178b80"} Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.235823 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9451f090d87a54fcdbbfb38d7aba157bfa9ee730f5dff506b43c409b10178b80" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.237352 4778 generic.go:334] "Generic (PLEG): container finished" podID="2cd169c1-e595-4497-856c-3dd27c1cf551" containerID="218eebc0b353408962cb4ccc7a6dbcb7e94147122bce8e36e5a54e046e5b06f8" exitCode=0 Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.237429 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" event={"ID":"2cd169c1-e595-4497-856c-3dd27c1cf551","Type":"ContainerDied","Data":"218eebc0b353408962cb4ccc7a6dbcb7e94147122bce8e36e5a54e046e5b06f8"} Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.239248 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qgm2" event={"ID":"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47","Type":"ContainerStarted","Data":"35627094815d8aab35b2d89ade00f4db12d2982f44d18065d3e2b7a8cff620a7"} Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.240882 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerStarted","Data":"079c4231aac10f3612c674a07d9729d7a0fd2d54be2669f250d2d9ca937b5439"} Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.263793 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6qgm2" podStartSLOduration=2.338288062 podStartE2EDuration="57.263773043s" podCreationTimestamp="2026-03-18 09:06:39 +0000 UTC" firstStartedPulling="2026-03-18 09:06:40.190029079 +0000 UTC m=+266.764773919" 
lastFinishedPulling="2026-03-18 09:07:35.11551405 +0000 UTC m=+321.690258900" observedRunningTime="2026-03-18 09:07:36.259746038 +0000 UTC m=+322.834490878" watchObservedRunningTime="2026-03-18 09:07:36.263773043 +0000 UTC m=+322.838517883" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.269292 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.286502 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6kvnk" podStartSLOduration=3.048662096 podStartE2EDuration="56.286465426s" podCreationTimestamp="2026-03-18 09:06:40 +0000 UTC" firstStartedPulling="2026-03-18 09:06:42.488354656 +0000 UTC m=+269.063099496" lastFinishedPulling="2026-03-18 09:07:35.726157986 +0000 UTC m=+322.300902826" observedRunningTime="2026-03-18 09:07:36.284417478 +0000 UTC m=+322.859162318" watchObservedRunningTime="2026-03-18 09:07:36.286465426 +0000 UTC m=+322.861210266" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.413102 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/180e4f84-52ed-4db3-aec6-c724becfadf1-serving-cert\") pod \"180e4f84-52ed-4db3-aec6-c724becfadf1\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.413185 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-client-ca\") pod \"180e4f84-52ed-4db3-aec6-c724becfadf1\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.413358 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8mw9\" (UniqueName: 
\"kubernetes.io/projected/180e4f84-52ed-4db3-aec6-c724becfadf1-kube-api-access-c8mw9\") pod \"180e4f84-52ed-4db3-aec6-c724becfadf1\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.413393 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-config\") pod \"180e4f84-52ed-4db3-aec6-c724becfadf1\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.414334 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-client-ca" (OuterVolumeSpecName: "client-ca") pod "180e4f84-52ed-4db3-aec6-c724becfadf1" (UID: "180e4f84-52ed-4db3-aec6-c724becfadf1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.414374 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-config" (OuterVolumeSpecName: "config") pod "180e4f84-52ed-4db3-aec6-c724becfadf1" (UID: "180e4f84-52ed-4db3-aec6-c724becfadf1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.414699 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.414746 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.420662 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180e4f84-52ed-4db3-aec6-c724becfadf1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "180e4f84-52ed-4db3-aec6-c724becfadf1" (UID: "180e4f84-52ed-4db3-aec6-c724becfadf1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.420847 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180e4f84-52ed-4db3-aec6-c724becfadf1-kube-api-access-c8mw9" (OuterVolumeSpecName: "kube-api-access-c8mw9") pod "180e4f84-52ed-4db3-aec6-c724becfadf1" (UID: "180e4f84-52ed-4db3-aec6-c724becfadf1"). InnerVolumeSpecName "kube-api-access-c8mw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.423866 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.515949 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/180e4f84-52ed-4db3-aec6-c724becfadf1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.515993 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8mw9\" (UniqueName: \"kubernetes.io/projected/180e4f84-52ed-4db3-aec6-c724becfadf1-kube-api-access-c8mw9\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.616730 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txfl6\" (UniqueName: \"kubernetes.io/projected/2cd169c1-e595-4497-856c-3dd27c1cf551-kube-api-access-txfl6\") pod \"2cd169c1-e595-4497-856c-3dd27c1cf551\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.616826 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-client-ca\") pod \"2cd169c1-e595-4497-856c-3dd27c1cf551\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.616859 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-proxy-ca-bundles\") pod \"2cd169c1-e595-4497-856c-3dd27c1cf551\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.616896 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd169c1-e595-4497-856c-3dd27c1cf551-serving-cert\") pod 
\"2cd169c1-e595-4497-856c-3dd27c1cf551\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.617010 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-config\") pod \"2cd169c1-e595-4497-856c-3dd27c1cf551\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.618141 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-client-ca" (OuterVolumeSpecName: "client-ca") pod "2cd169c1-e595-4497-856c-3dd27c1cf551" (UID: "2cd169c1-e595-4497-856c-3dd27c1cf551"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.618159 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2cd169c1-e595-4497-856c-3dd27c1cf551" (UID: "2cd169c1-e595-4497-856c-3dd27c1cf551"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.618236 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-config" (OuterVolumeSpecName: "config") pod "2cd169c1-e595-4497-856c-3dd27c1cf551" (UID: "2cd169c1-e595-4497-856c-3dd27c1cf551"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.624446 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd169c1-e595-4497-856c-3dd27c1cf551-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2cd169c1-e595-4497-856c-3dd27c1cf551" (UID: "2cd169c1-e595-4497-856c-3dd27c1cf551"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.624584 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd169c1-e595-4497-856c-3dd27c1cf551-kube-api-access-txfl6" (OuterVolumeSpecName: "kube-api-access-txfl6") pod "2cd169c1-e595-4497-856c-3dd27c1cf551" (UID: "2cd169c1-e595-4497-856c-3dd27c1cf551"). InnerVolumeSpecName "kube-api-access-txfl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.718447 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.718495 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txfl6\" (UniqueName: \"kubernetes.io/projected/2cd169c1-e595-4497-856c-3dd27c1cf551-kube-api-access-txfl6\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.718513 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.718530 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.718546 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd169c1-e595-4497-856c-3dd27c1cf551-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.821246 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66b77f46fb-47b5r"] Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.821734 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd169c1-e595-4497-856c-3dd27c1cf551" containerName="controller-manager" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.821814 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd169c1-e595-4497-856c-3dd27c1cf551" containerName="controller-manager" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.821880 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="registry-server" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.821937 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="registry-server" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.821992 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="extract-utilities" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822047 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="extract-utilities" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822099 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180e4f84-52ed-4db3-aec6-c724becfadf1" containerName="route-controller-manager" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822148 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="180e4f84-52ed-4db3-aec6-c724becfadf1" containerName="route-controller-manager" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822221 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54961f10-93b0-433f-8a7d-b30d69178e9a" containerName="oc" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822275 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="54961f10-93b0-433f-8a7d-b30d69178e9a" containerName="oc" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822343 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="extract-content" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822398 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="extract-content" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822458 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="extract-utilities" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822514 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="extract-utilities" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822647 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="registry-server" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822707 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="registry-server" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822769 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3be356e-94af-47db-a182-dd8a57024619" containerName="oc" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822826 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c3be356e-94af-47db-a182-dd8a57024619" containerName="oc" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822886 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="extract-content" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822943 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="extract-content" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823097 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd169c1-e595-4497-856c-3dd27c1cf551" containerName="controller-manager" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823165 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="registry-server" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823268 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="registry-server" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823324 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="54961f10-93b0-433f-8a7d-b30d69178e9a" containerName="oc" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823384 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3be356e-94af-47db-a182-dd8a57024619" containerName="oc" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823439 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="180e4f84-52ed-4db3-aec6-c724becfadf1" containerName="route-controller-manager" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823917 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.828785 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w"] Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.837996 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.850428 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66b77f46fb-47b5r"] Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.862027 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w"] Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.022910 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/532bf41b-51fb-4815-ab26-8fb2d12526d2-serving-cert\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.023422 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26k95\" (UniqueName: \"kubernetes.io/projected/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-kube-api-access-26k95\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.023667 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-config\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.023795 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-client-ca\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.023932 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-serving-cert\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.024048 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsn88\" (UniqueName: \"kubernetes.io/projected/532bf41b-51fb-4815-ab26-8fb2d12526d2-kube-api-access-xsn88\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.024254 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-config\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " 
pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.024313 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-proxy-ca-bundles\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.024346 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-client-ca\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125750 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-config\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125823 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-client-ca\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125854 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-serving-cert\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125878 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsn88\" (UniqueName: \"kubernetes.io/projected/532bf41b-51fb-4815-ab26-8fb2d12526d2-kube-api-access-xsn88\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125905 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-config\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125922 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-proxy-ca-bundles\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125939 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-client-ca\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 
09:07:37.125983 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/532bf41b-51fb-4815-ab26-8fb2d12526d2-serving-cert\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.126001 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26k95\" (UniqueName: \"kubernetes.io/projected/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-kube-api-access-26k95\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.127019 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-client-ca\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.127069 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-client-ca\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.127253 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-proxy-ca-bundles\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " 
pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.127599 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-config\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.128502 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-config\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.129941 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/532bf41b-51fb-4815-ab26-8fb2d12526d2-serving-cert\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.130631 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-serving-cert\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.148329 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsn88\" (UniqueName: \"kubernetes.io/projected/532bf41b-51fb-4815-ab26-8fb2d12526d2-kube-api-access-xsn88\") pod 
\"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.150673 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26k95\" (UniqueName: \"kubernetes.io/projected/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-kube-api-access-26k95\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.167512 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.175994 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.249038 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerStarted","Data":"5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568"} Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.252971 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.253418 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.253411 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" event={"ID":"2cd169c1-e595-4497-856c-3dd27c1cf551","Type":"ContainerDied","Data":"fd1c635644f48a9fb2e1288c201f66a5f81d58f274081aff739ae76932d747eb"} Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.253633 4778 scope.go:117] "RemoveContainer" containerID="218eebc0b353408962cb4ccc7a6dbcb7e94147122bce8e36e5a54e046e5b06f8" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.290946 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t74gc" podStartSLOduration=3.641752263 podStartE2EDuration="57.290911592s" podCreationTimestamp="2026-03-18 09:06:40 +0000 UTC" firstStartedPulling="2026-03-18 09:06:42.441171998 +0000 UTC m=+269.015916838" lastFinishedPulling="2026-03-18 09:07:36.090331327 +0000 UTC m=+322.665076167" observedRunningTime="2026-03-18 09:07:37.288664148 +0000 UTC m=+323.863408998" watchObservedRunningTime="2026-03-18 09:07:37.290911592 +0000 UTC m=+323.865656432" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.369351 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24"] Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.381301 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24"] Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.392224 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-785666b9f5-xt96c"] Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.396567 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-785666b9f5-xt96c"] Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.680024 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w"] Mar 18 09:07:37 crc kubenswrapper[4778]: W0318 09:07:37.691675 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aae8a16_f704_4764_bfd7_7a0cfed2eee3.slice/crio-a7918f18849ba2e79d4ef72f4ad9306fd3cd739ca9880924f05fcd02b77c8d3d WatchSource:0}: Error finding container a7918f18849ba2e79d4ef72f4ad9306fd3cd739ca9880924f05fcd02b77c8d3d: Status 404 returned error can't find the container with id a7918f18849ba2e79d4ef72f4ad9306fd3cd739ca9880924f05fcd02b77c8d3d Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.749594 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66b77f46fb-47b5r"] Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.195383 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="180e4f84-52ed-4db3-aec6-c724becfadf1" path="/var/lib/kubelet/pods/180e4f84-52ed-4db3-aec6-c724becfadf1/volumes" Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.196183 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd169c1-e595-4497-856c-3dd27c1cf551" path="/var/lib/kubelet/pods/2cd169c1-e595-4497-856c-3dd27c1cf551/volumes" Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.259044 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" event={"ID":"532bf41b-51fb-4815-ab26-8fb2d12526d2","Type":"ContainerStarted","Data":"8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064"} Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.259105 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" event={"ID":"532bf41b-51fb-4815-ab26-8fb2d12526d2","Type":"ContainerStarted","Data":"2f81410394a4a7c6f801aac32f2e683ca0e93e74a4322b2f6a48fc440a4c2e61"} Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.259262 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.261766 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" event={"ID":"4aae8a16-f704-4764-bfd7-7a0cfed2eee3","Type":"ContainerStarted","Data":"505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9"} Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.261804 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" event={"ID":"4aae8a16-f704-4764-bfd7-7a0cfed2eee3","Type":"ContainerStarted","Data":"a7918f18849ba2e79d4ef72f4ad9306fd3cd739ca9880924f05fcd02b77c8d3d"} Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.265436 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.276274 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" podStartSLOduration=3.276256608 podStartE2EDuration="3.276256608s" podCreationTimestamp="2026-03-18 09:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:38.276250988 +0000 UTC m=+324.850995838" watchObservedRunningTime="2026-03-18 09:07:38.276256608 +0000 UTC m=+324.851001448" Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.314589 
4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" podStartSLOduration=3.314568244 podStartE2EDuration="3.314568244s" podCreationTimestamp="2026-03-18 09:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:38.312168036 +0000 UTC m=+324.886912886" watchObservedRunningTime="2026-03-18 09:07:38.314568244 +0000 UTC m=+324.889313084" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.268634 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.274944 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.402891 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.402959 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.445428 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.820379 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.820781 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.872903 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.283809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.284078 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.284123 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.284516 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.286612 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-console"/"networking-console-plugin" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.286784 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.287366 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.298474 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.298901 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.311735 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.314757 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.316267 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.320437 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.320859 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.320895 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.334663 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.344972 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.483079 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.483123 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:07:40 crc kubenswrapper[4778]: W0318 09:07:40.608032 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-2a7beaa4634563e06d6bda6369bf6d700ba94f2f182d3fa40d2c50908dc98c55 WatchSource:0}: Error finding container 2a7beaa4634563e06d6bda6369bf6d700ba94f2f182d3fa40d2c50908dc98c55: Status 404 returned error can't find the container with id 2a7beaa4634563e06d6bda6369bf6d700ba94f2f182d3fa40d2c50908dc98c55 Mar 18 09:07:40 crc kubenswrapper[4778]: W0318 09:07:40.850866 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-8c22cc8478d5b3d363a57fe9254a569605c448ef5cd0effed66a11a35b944d72 WatchSource:0}: Error finding container 8c22cc8478d5b3d363a57fe9254a569605c448ef5cd0effed66a11a35b944d72: Status 404 returned error can't find the container with id 8c22cc8478d5b3d363a57fe9254a569605c448ef5cd0effed66a11a35b944d72 Mar 18 09:07:40 crc kubenswrapper[4778]: W0318 09:07:40.873804 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-14d592f01bdff8b8724bfdc1105b2472de2fc9986350a53a853426fe619c6b66 WatchSource:0}: Error finding container 14d592f01bdff8b8724bfdc1105b2472de2fc9986350a53a853426fe619c6b66: Status 404 returned error can't find the 
container with id 14d592f01bdff8b8724bfdc1105b2472de2fc9986350a53a853426fe619c6b66 Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.974901 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.974957 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.282080 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4fb785520bd9b67b63e2803cd0bbc614c5ddc34e3b3cb7ad83141c2fe7b8fa68"} Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.282137 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"14d592f01bdff8b8724bfdc1105b2472de2fc9986350a53a853426fe619c6b66"} Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.283500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"788b786676596d76f803e3247912acf20e74bd978280cc6d03e54295404de15b"} Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.283528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8c22cc8478d5b3d363a57fe9254a569605c448ef5cd0effed66a11a35b944d72"} Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.283711 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.284856 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2a6479e7a6f9ce91fa00c203b7c22ba9cca43f7b7469d035d135fd1874b5d511"} Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.284885 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2a7beaa4634563e06d6bda6369bf6d700ba94f2f182d3fa40d2c50908dc98c55"} Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.552030 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6kvnk" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="registry-server" probeResult="failure" output=< Mar 18 09:07:41 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:07:41 crc kubenswrapper[4778]: > Mar 18 09:07:42 crc kubenswrapper[4778]: I0318 09:07:42.019047 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t74gc" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="registry-server" probeResult="failure" output=< Mar 18 09:07:42 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:07:42 crc kubenswrapper[4778]: > Mar 18 09:07:42 crc kubenswrapper[4778]: I0318 09:07:42.962135 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rscg9"] Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.296004 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rscg9" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" 
containerName="registry-server" containerID="cri-o://5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be" gracePeriod=2 Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.761274 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.938536 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnfhp\" (UniqueName: \"kubernetes.io/projected/3aedbf59-d23d-409e-9742-09824ed6ef2a-kube-api-access-gnfhp\") pod \"3aedbf59-d23d-409e-9742-09824ed6ef2a\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.938632 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-utilities\") pod \"3aedbf59-d23d-409e-9742-09824ed6ef2a\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.938730 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-catalog-content\") pod \"3aedbf59-d23d-409e-9742-09824ed6ef2a\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.939738 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-utilities" (OuterVolumeSpecName: "utilities") pod "3aedbf59-d23d-409e-9742-09824ed6ef2a" (UID: "3aedbf59-d23d-409e-9742-09824ed6ef2a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.946510 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aedbf59-d23d-409e-9742-09824ed6ef2a-kube-api-access-gnfhp" (OuterVolumeSpecName: "kube-api-access-gnfhp") pod "3aedbf59-d23d-409e-9742-09824ed6ef2a" (UID: "3aedbf59-d23d-409e-9742-09824ed6ef2a"). InnerVolumeSpecName "kube-api-access-gnfhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.954086 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.954138 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnfhp\" (UniqueName: \"kubernetes.io/projected/3aedbf59-d23d-409e-9742-09824ed6ef2a-kube-api-access-gnfhp\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.997998 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3aedbf59-d23d-409e-9742-09824ed6ef2a" (UID: "3aedbf59-d23d-409e-9742-09824ed6ef2a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.054843 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.308146 4778 generic.go:334] "Generic (PLEG): container finished" podID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerID="5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be" exitCode=0 Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.308223 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rscg9" event={"ID":"3aedbf59-d23d-409e-9742-09824ed6ef2a","Type":"ContainerDied","Data":"5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be"} Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.308258 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rscg9" event={"ID":"3aedbf59-d23d-409e-9742-09824ed6ef2a","Type":"ContainerDied","Data":"535b754da1c090611910ad3734bf58ec9a371b36592f999639d434308dca9ac6"} Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.308279 4778 scope.go:117] "RemoveContainer" containerID="5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be" Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.308347 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.338006 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rscg9"] Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.343511 4778 scope.go:117] "RemoveContainer" containerID="dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea" Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.347431 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rscg9"] Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.364063 4778 scope.go:117] "RemoveContainer" containerID="1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58" Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.397066 4778 scope.go:117] "RemoveContainer" containerID="5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be" Mar 18 09:07:44 crc kubenswrapper[4778]: E0318 09:07:44.397963 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be\": container with ID starting with 5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be not found: ID does not exist" containerID="5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be" Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.398319 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be"} err="failed to get container status \"5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be\": rpc error: code = NotFound desc = could not find container \"5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be\": container with ID starting with 5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be not found: 
ID does not exist" Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.398345 4778 scope.go:117] "RemoveContainer" containerID="dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea" Mar 18 09:07:44 crc kubenswrapper[4778]: E0318 09:07:44.398742 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea\": container with ID starting with dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea not found: ID does not exist" containerID="dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea" Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.398794 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea"} err="failed to get container status \"dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea\": rpc error: code = NotFound desc = could not find container \"dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea\": container with ID starting with dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea not found: ID does not exist" Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.398826 4778 scope.go:117] "RemoveContainer" containerID="1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58" Mar 18 09:07:44 crc kubenswrapper[4778]: E0318 09:07:44.399189 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58\": container with ID starting with 1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58 not found: ID does not exist" containerID="1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58" Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.399287 4778 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58"} err="failed to get container status \"1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58\": rpc error: code = NotFound desc = could not find container \"1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58\": container with ID starting with 1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58 not found: ID does not exist" Mar 18 09:07:46 crc kubenswrapper[4778]: I0318 09:07:46.199475 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" path="/var/lib/kubelet/pods/3aedbf59-d23d-409e-9742-09824ed6ef2a/volumes" Mar 18 09:07:50 crc kubenswrapper[4778]: I0318 09:07:50.529241 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:07:50 crc kubenswrapper[4778]: I0318 09:07:50.585529 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:07:51 crc kubenswrapper[4778]: I0318 09:07:51.046622 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:07:51 crc kubenswrapper[4778]: I0318 09:07:51.095830 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:07:51 crc kubenswrapper[4778]: I0318 09:07:51.776989 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t74gc"] Mar 18 09:07:52 crc kubenswrapper[4778]: I0318 09:07:52.366856 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t74gc" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="registry-server" 
containerID="cri-o://5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568" gracePeriod=2 Mar 18 09:07:52 crc kubenswrapper[4778]: I0318 09:07:52.902158 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:07:52 crc kubenswrapper[4778]: I0318 09:07:52.999174 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-catalog-content\") pod \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " Mar 18 09:07:52 crc kubenswrapper[4778]: I0318 09:07:52.999404 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlv42\" (UniqueName: \"kubernetes.io/projected/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-kube-api-access-dlv42\") pod \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " Mar 18 09:07:52 crc kubenswrapper[4778]: I0318 09:07:52.999538 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-utilities\") pod \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.000541 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-utilities" (OuterVolumeSpecName: "utilities") pod "01828fdf-ef1b-44e3-905b-aec0c6aaa44f" (UID: "01828fdf-ef1b-44e3-905b-aec0c6aaa44f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.009003 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-kube-api-access-dlv42" (OuterVolumeSpecName: "kube-api-access-dlv42") pod "01828fdf-ef1b-44e3-905b-aec0c6aaa44f" (UID: "01828fdf-ef1b-44e3-905b-aec0c6aaa44f"). InnerVolumeSpecName "kube-api-access-dlv42". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.101415 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlv42\" (UniqueName: \"kubernetes.io/projected/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-kube-api-access-dlv42\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.101458 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.175405 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01828fdf-ef1b-44e3-905b-aec0c6aaa44f" (UID: "01828fdf-ef1b-44e3-905b-aec0c6aaa44f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.203164 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.383958 4778 generic.go:334] "Generic (PLEG): container finished" podID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerID="5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568" exitCode=0 Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.384051 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerDied","Data":"5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568"} Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.384115 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerDied","Data":"3806b4b94100c7a5155593ac6313aa0ecd333d565872aab11d373225ceac8468"} Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.384162 4778 scope.go:117] "RemoveContainer" containerID="5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.384165 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.421272 4778 scope.go:117] "RemoveContainer" containerID="fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.431064 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" podUID="db81860d-bcb7-4a56-a935-544dbc4be29b" containerName="oauth-openshift" containerID="cri-o://fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8" gracePeriod=15 Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.445121 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t74gc"] Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.459782 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t74gc"] Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.468690 4778 scope.go:117] "RemoveContainer" containerID="041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.497056 4778 scope.go:117] "RemoveContainer" containerID="5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568" Mar 18 09:07:53 crc kubenswrapper[4778]: E0318 09:07:53.498289 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568\": container with ID starting with 5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568 not found: ID does not exist" containerID="5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.498362 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568"} err="failed to get container status \"5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568\": rpc error: code = NotFound desc = could not find container \"5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568\": container with ID starting with 5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568 not found: ID does not exist" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.498410 4778 scope.go:117] "RemoveContainer" containerID="fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731" Mar 18 09:07:53 crc kubenswrapper[4778]: E0318 09:07:53.499000 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731\": container with ID starting with fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731 not found: ID does not exist" containerID="fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.499101 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731"} err="failed to get container status \"fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731\": rpc error: code = NotFound desc = could not find container \"fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731\": container with ID starting with fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731 not found: ID does not exist" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.499236 4778 scope.go:117] "RemoveContainer" containerID="041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118" Mar 18 09:07:53 crc kubenswrapper[4778]: E0318 09:07:53.500766 4778 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118\": container with ID starting with 041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118 not found: ID does not exist" containerID="041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.500871 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118"} err="failed to get container status \"041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118\": rpc error: code = NotFound desc = could not find container \"041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118\": container with ID starting with 041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118 not found: ID does not exist" Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.950748 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015611 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-session\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015697 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-trusted-ca-bundle\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015727 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-idp-0-file-data\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015749 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-provider-selection\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015790 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-ocp-branding-template\") pod 
\"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015869 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-policies\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015894 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-serving-cert\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015928 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-service-ca\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015958 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-error\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015985 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-dir\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 
09:07:54.016004 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-router-certs\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.016031 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-login\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.016059 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxpv5\" (UniqueName: \"kubernetes.io/projected/db81860d-bcb7-4a56-a935-544dbc4be29b-kube-api-access-jxpv5\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.016079 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-cliconfig\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.017172 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.017331 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.018766 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.018790 4778 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.020138 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.020233 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.021861 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.022334 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.023184 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.023411 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.024980 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.025445 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.025738 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.026223 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.026929 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.040112 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db81860d-bcb7-4a56-a935-544dbc4be29b-kube-api-access-jxpv5" (OuterVolumeSpecName: "kube-api-access-jxpv5") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "kube-api-access-jxpv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120036 4778 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120080 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120302 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120312 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxpv5\" (UniqueName: \"kubernetes.io/projected/db81860d-bcb7-4a56-a935-544dbc4be29b-kube-api-access-jxpv5\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120321 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120330 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120342 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120353 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120364 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120375 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120385 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120393 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.193601 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" path="/var/lib/kubelet/pods/01828fdf-ef1b-44e3-905b-aec0c6aaa44f/volumes" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.398095 4778 
generic.go:334] "Generic (PLEG): container finished" podID="db81860d-bcb7-4a56-a935-544dbc4be29b" containerID="fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8" exitCode=0 Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.398171 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.398171 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" event={"ID":"db81860d-bcb7-4a56-a935-544dbc4be29b","Type":"ContainerDied","Data":"fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8"} Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.398313 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" event={"ID":"db81860d-bcb7-4a56-a935-544dbc4be29b","Type":"ContainerDied","Data":"233e58e62c8d40d87963329725284bd0d629e6646b095fe46a9712b711f0c101"} Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.398355 4778 scope.go:117] "RemoveContainer" containerID="fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.429921 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5dpv"] Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.435238 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5dpv"] Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.444814 4778 scope.go:117] "RemoveContainer" containerID="fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8" Mar 18 09:07:54 crc kubenswrapper[4778]: E0318 09:07:54.445176 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8\": container with ID starting with fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8 not found: ID does not exist" containerID="fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.445236 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8"} err="failed to get container status \"fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8\": rpc error: code = NotFound desc = could not find container \"fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8\": container with ID starting with fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8 not found: ID does not exist" Mar 18 09:07:55 crc kubenswrapper[4778]: I0318 09:07:55.674076 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66b77f46fb-47b5r"] Mar 18 09:07:55 crc kubenswrapper[4778]: I0318 09:07:55.674516 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" podUID="532bf41b-51fb-4815-ab26-8fb2d12526d2" containerName="controller-manager" containerID="cri-o://8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064" gracePeriod=30 Mar 18 09:07:55 crc kubenswrapper[4778]: I0318 09:07:55.744790 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w"] Mar 18 09:07:55 crc kubenswrapper[4778]: I0318 09:07:55.745008 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" podUID="4aae8a16-f704-4764-bfd7-7a0cfed2eee3" containerName="route-controller-manager" 
containerID="cri-o://505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9" gracePeriod=30 Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.195649 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db81860d-bcb7-4a56-a935-544dbc4be29b" path="/var/lib/kubelet/pods/db81860d-bcb7-4a56-a935-544dbc4be29b/volumes" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.243873 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.248228 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357055 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-proxy-ca-bundles\") pod \"532bf41b-51fb-4815-ab26-8fb2d12526d2\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357181 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/532bf41b-51fb-4815-ab26-8fb2d12526d2-serving-cert\") pod \"532bf41b-51fb-4815-ab26-8fb2d12526d2\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357249 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26k95\" (UniqueName: \"kubernetes.io/projected/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-kube-api-access-26k95\") pod \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357300 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-xsn88\" (UniqueName: \"kubernetes.io/projected/532bf41b-51fb-4815-ab26-8fb2d12526d2-kube-api-access-xsn88\") pod \"532bf41b-51fb-4815-ab26-8fb2d12526d2\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357341 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-config\") pod \"532bf41b-51fb-4815-ab26-8fb2d12526d2\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357380 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-config\") pod \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-serving-cert\") pod \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357452 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-client-ca\") pod \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357472 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-client-ca\") pod \"532bf41b-51fb-4815-ab26-8fb2d12526d2\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 
09:07:56.358598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "532bf41b-51fb-4815-ab26-8fb2d12526d2" (UID: "532bf41b-51fb-4815-ab26-8fb2d12526d2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.358674 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-client-ca" (OuterVolumeSpecName: "client-ca") pod "532bf41b-51fb-4815-ab26-8fb2d12526d2" (UID: "532bf41b-51fb-4815-ab26-8fb2d12526d2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.358798 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-config" (OuterVolumeSpecName: "config") pod "532bf41b-51fb-4815-ab26-8fb2d12526d2" (UID: "532bf41b-51fb-4815-ab26-8fb2d12526d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.358926 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-client-ca" (OuterVolumeSpecName: "client-ca") pod "4aae8a16-f704-4764-bfd7-7a0cfed2eee3" (UID: "4aae8a16-f704-4764-bfd7-7a0cfed2eee3"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.359057 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-config" (OuterVolumeSpecName: "config") pod "4aae8a16-f704-4764-bfd7-7a0cfed2eee3" (UID: "4aae8a16-f704-4764-bfd7-7a0cfed2eee3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.362726 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4aae8a16-f704-4764-bfd7-7a0cfed2eee3" (UID: "4aae8a16-f704-4764-bfd7-7a0cfed2eee3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.363036 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532bf41b-51fb-4815-ab26-8fb2d12526d2-kube-api-access-xsn88" (OuterVolumeSpecName: "kube-api-access-xsn88") pod "532bf41b-51fb-4815-ab26-8fb2d12526d2" (UID: "532bf41b-51fb-4815-ab26-8fb2d12526d2"). InnerVolumeSpecName "kube-api-access-xsn88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.363169 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532bf41b-51fb-4815-ab26-8fb2d12526d2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "532bf41b-51fb-4815-ab26-8fb2d12526d2" (UID: "532bf41b-51fb-4815-ab26-8fb2d12526d2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.364766 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-kube-api-access-26k95" (OuterVolumeSpecName: "kube-api-access-26k95") pod "4aae8a16-f704-4764-bfd7-7a0cfed2eee3" (UID: "4aae8a16-f704-4764-bfd7-7a0cfed2eee3"). InnerVolumeSpecName "kube-api-access-26k95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.424181 4778 generic.go:334] "Generic (PLEG): container finished" podID="4aae8a16-f704-4764-bfd7-7a0cfed2eee3" containerID="505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9" exitCode=0 Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.424264 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" event={"ID":"4aae8a16-f704-4764-bfd7-7a0cfed2eee3","Type":"ContainerDied","Data":"505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9"} Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.424290 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.424336 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" event={"ID":"4aae8a16-f704-4764-bfd7-7a0cfed2eee3","Type":"ContainerDied","Data":"a7918f18849ba2e79d4ef72f4ad9306fd3cd739ca9880924f05fcd02b77c8d3d"} Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.424369 4778 scope.go:117] "RemoveContainer" containerID="505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.426420 4778 generic.go:334] "Generic (PLEG): container finished" podID="532bf41b-51fb-4815-ab26-8fb2d12526d2" containerID="8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064" exitCode=0 Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.426458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" event={"ID":"532bf41b-51fb-4815-ab26-8fb2d12526d2","Type":"ContainerDied","Data":"8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064"} Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.426497 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" event={"ID":"532bf41b-51fb-4815-ab26-8fb2d12526d2","Type":"ContainerDied","Data":"2f81410394a4a7c6f801aac32f2e683ca0e93e74a4322b2f6a48fc440a4c2e61"} Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.426572 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.450603 4778 scope.go:117] "RemoveContainer" containerID="505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9" Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.451356 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9\": container with ID starting with 505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9 not found: ID does not exist" containerID="505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.451473 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9"} err="failed to get container status \"505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9\": rpc error: code = NotFound desc = could not find container \"505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9\": container with ID starting with 505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9 not found: ID does not exist" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.451579 4778 scope.go:117] "RemoveContainer" containerID="8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458853 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458899 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/532bf41b-51fb-4815-ab26-8fb2d12526d2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458918 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26k95\" (UniqueName: \"kubernetes.io/projected/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-kube-api-access-26k95\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458939 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsn88\" (UniqueName: \"kubernetes.io/projected/532bf41b-51fb-4815-ab26-8fb2d12526d2-kube-api-access-xsn88\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458958 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458974 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458989 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.459005 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.459022 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: 
I0318 09:07:56.460352 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w"] Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.462949 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w"] Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.474892 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66b77f46fb-47b5r"] Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.478650 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66b77f46fb-47b5r"] Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.485024 4778 scope.go:117] "RemoveContainer" containerID="8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064" Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.485652 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064\": container with ID starting with 8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064 not found: ID does not exist" containerID="8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.485783 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064"} err="failed to get container status \"8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064\": rpc error: code = NotFound desc = could not find container \"8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064\": container with ID starting with 8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064 not found: ID does not exist" Mar 18 09:07:56 crc 
kubenswrapper[4778]: I0318 09:07:56.835311 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"] Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.836643 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532bf41b-51fb-4815-ab26-8fb2d12526d2" containerName="controller-manager" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.836714 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="532bf41b-51fb-4815-ab26-8fb2d12526d2" containerName="controller-manager" Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.836772 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="extract-content" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.836825 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="extract-content" Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.836891 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="extract-utilities" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.836949 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="extract-utilities" Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.837016 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="extract-content" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837074 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="extract-content" Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.837126 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db81860d-bcb7-4a56-a935-544dbc4be29b" containerName="oauth-openshift" Mar 18 09:07:56 
crc kubenswrapper[4778]: I0318 09:07:56.837176 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="db81860d-bcb7-4a56-a935-544dbc4be29b" containerName="oauth-openshift" Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.837254 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="registry-server" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837331 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="registry-server" Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.837407 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="extract-utilities" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837461 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="extract-utilities" Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.837514 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aae8a16-f704-4764-bfd7-7a0cfed2eee3" containerName="route-controller-manager" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837564 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aae8a16-f704-4764-bfd7-7a0cfed2eee3" containerName="route-controller-manager" Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.837618 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="registry-server" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837668 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="registry-server" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837819 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="registry-server" Mar 18 
09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837880 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="db81860d-bcb7-4a56-a935-544dbc4be29b" containerName="oauth-openshift" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837937 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="532bf41b-51fb-4815-ab26-8fb2d12526d2" containerName="controller-manager" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837996 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="registry-server" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.838053 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aae8a16-f704-4764-bfd7-7a0cfed2eee3" containerName="route-controller-manager" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.838400 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"] Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.838535 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.838860 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"] Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.839132 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.839542 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.843215 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.843327 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.843385 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.843474 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.843484 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.843544 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.844068 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.844329 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.844438 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.844135 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 
09:07:56.844139 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.844184 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.845989 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.846008 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.846221 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.846311 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.846413 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.846770 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.847561 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.847815 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.848730 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.849262 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.850284 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.852655 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.858792 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"] Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.861987 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.868129 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"] Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.880184 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.891633 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.898266 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.903446 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"] Mar 18 
09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965635 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-error\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965705 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-session\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965734 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-proxy-ca-bundles\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965757 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965813 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965834 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-audit-dir\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-service-ca\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965973 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966013 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a393524-f81f-4ff5-b836-29188770f717-config\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966165 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq69g\" (UniqueName: \"kubernetes.io/projected/bb3a9066-971e-467b-bb54-8ba8b720781e-kube-api-access-mq69g\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966186 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966277 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-client-ca\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 
09:07:56.966317 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvdq9\" (UniqueName: \"kubernetes.io/projected/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-kube-api-access-pvdq9\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966342 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a393524-f81f-4ff5-b836-29188770f717-serving-cert\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966372 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-login\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966402 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltjkk\" (UniqueName: \"kubernetes.io/projected/6a393524-f81f-4ff5-b836-29188770f717-kube-api-access-ltjkk\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966432 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6a393524-f81f-4ff5-b836-29188770f717-client-ca\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966509 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb3a9066-971e-467b-bb54-8ba8b720781e-serving-cert\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966588 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-config\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966690 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966752 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-router-certs\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: 
\"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966784 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-audit-policies\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068456 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-client-ca\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068514 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvdq9\" (UniqueName: \"kubernetes.io/projected/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-kube-api-access-pvdq9\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068536 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a393524-f81f-4ff5-b836-29188770f717-serving-cert\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068557 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-login\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068576 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltjkk\" (UniqueName: \"kubernetes.io/projected/6a393524-f81f-4ff5-b836-29188770f717-kube-api-access-ltjkk\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068594 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a393524-f81f-4ff5-b836-29188770f717-client-ca\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068613 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb3a9066-971e-467b-bb54-8ba8b720781e-serving-cert\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068630 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-config\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc 
kubenswrapper[4778]: I0318 09:07:57.068651 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-router-certs\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068689 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-audit-policies\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068707 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-error\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068730 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-session\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068747 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-proxy-ca-bundles\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068767 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068786 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068807 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-audit-dir\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc 
kubenswrapper[4778]: I0318 09:07:57.068823 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-service-ca\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068862 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068879 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a393524-f81f-4ff5-b836-29188770f717-config\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068917 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq69g\" (UniqueName: 
\"kubernetes.io/projected/bb3a9066-971e-467b-bb54-8ba8b720781e-kube-api-access-mq69g\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068935 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.069673 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.070276 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-client-ca\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.071560 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-service-ca\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc 
kubenswrapper[4778]: I0318 09:07:57.072317 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-config\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.072704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.072733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-audit-dir\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.073906 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-proxy-ca-bundles\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.074353 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: 
\"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.074628 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.074775 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a393524-f81f-4ff5-b836-29188770f717-serving-cert\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.075047 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a393524-f81f-4ff5-b836-29188770f717-client-ca\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.075125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-session\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.075683 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-audit-policies\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.076490 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-login\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.076552 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a393524-f81f-4ff5-b836-29188770f717-config\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.077220 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-router-certs\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.079303 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 
18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.080554 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-error\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.086549 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb3a9066-971e-467b-bb54-8ba8b720781e-serving-cert\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.089817 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.094254 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvdq9\" (UniqueName: \"kubernetes.io/projected/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-kube-api-access-pvdq9\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.101574 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq69g\" (UniqueName: \"kubernetes.io/projected/bb3a9066-971e-467b-bb54-8ba8b720781e-kube-api-access-mq69g\") pod \"controller-manager-7585d5b4f9-5ltpp\" 
(UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.101577 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltjkk\" (UniqueName: \"kubernetes.io/projected/6a393524-f81f-4ff5-b836-29188770f717-kube-api-access-ltjkk\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.158878 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.186814 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.199779 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.424413 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"] Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.684317 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"] Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.733272 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"] Mar 18 09:07:57 crc kubenswrapper[4778]: W0318 09:07:57.748064 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb3a9066_971e_467b_bb54_8ba8b720781e.slice/crio-1b681372cb607effc0785cfbf4328cc38b50f46b450f3104cbd0a9baf4f13f20 WatchSource:0}: Error finding container 1b681372cb607effc0785cfbf4328cc38b50f46b450f3104cbd0a9baf4f13f20: Status 404 returned error can't find the container with id 1b681372cb607effc0785cfbf4328cc38b50f46b450f3104cbd0a9baf4f13f20 Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.196086 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aae8a16-f704-4764-bfd7-7a0cfed2eee3" path="/var/lib/kubelet/pods/4aae8a16-f704-4764-bfd7-7a0cfed2eee3/volumes" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.197418 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532bf41b-51fb-4815-ab26-8fb2d12526d2" path="/var/lib/kubelet/pods/532bf41b-51fb-4815-ab26-8fb2d12526d2/volumes" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.453323 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" 
event={"ID":"6a393524-f81f-4ff5-b836-29188770f717","Type":"ContainerStarted","Data":"4f421a15a28224b1ea94119a9d5502c90bb61897e2bd67989de460b3e188a852"} Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.453388 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" event={"ID":"6a393524-f81f-4ff5-b836-29188770f717","Type":"ContainerStarted","Data":"770bb3bdcd9b98c73620ddf0e1e1681db38f82724a883f7e2649d58a06005318"} Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.453696 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.455858 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" event={"ID":"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec","Type":"ContainerStarted","Data":"c81a3d973fd0941efd92be1e169e4358139cbc96cb0f800d5731165eeeddacb6"} Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.455906 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" event={"ID":"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec","Type":"ContainerStarted","Data":"c7d26979f02bd6139800747cd734de010dde8aa3b6ff6027b691b8bd7e548820"} Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.456082 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.457800 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" event={"ID":"bb3a9066-971e-467b-bb54-8ba8b720781e","Type":"ContainerStarted","Data":"8d81186147e1ac52f0e6fd8ba5b0f75252b9187d8d5cdf7266a08aeaa2bd6933"} Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.457865 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" event={"ID":"bb3a9066-971e-467b-bb54-8ba8b720781e","Type":"ContainerStarted","Data":"1b681372cb607effc0785cfbf4328cc38b50f46b450f3104cbd0a9baf4f13f20"} Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.458078 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.461817 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.465536 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.482284 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" podStartSLOduration=3.482263398 podStartE2EDuration="3.482263398s" podCreationTimestamp="2026-03-18 09:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:58.480016987 +0000 UTC m=+345.054761837" watchObservedRunningTime="2026-03-18 09:07:58.482263398 +0000 UTC m=+345.057008238" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.549573 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" podStartSLOduration=30.549558374 podStartE2EDuration="30.549558374s" podCreationTimestamp="2026-03-18 09:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:58.524664719 +0000 UTC 
m=+345.099409579" watchObservedRunningTime="2026-03-18 09:07:58.549558374 +0000 UTC m=+345.124303214" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.550441 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" podStartSLOduration=3.550436698 podStartE2EDuration="3.550436698s" podCreationTimestamp="2026-03-18 09:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:58.547375015 +0000 UTC m=+345.122119845" watchObservedRunningTime="2026-03-18 09:07:58.550436698 +0000 UTC m=+345.125181538" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.845832 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.144327 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563748-8q2hs"] Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.148156 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.150887 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563748-8q2hs"] Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.156589 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.156885 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.157151 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.234518 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx4kt\" (UniqueName: \"kubernetes.io/projected/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126-kube-api-access-sx4kt\") pod \"auto-csr-approver-29563748-8q2hs\" (UID: \"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126\") " pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.337149 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx4kt\" (UniqueName: \"kubernetes.io/projected/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126-kube-api-access-sx4kt\") pod \"auto-csr-approver-29563748-8q2hs\" (UID: \"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126\") " pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.369403 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx4kt\" (UniqueName: \"kubernetes.io/projected/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126-kube-api-access-sx4kt\") pod \"auto-csr-approver-29563748-8q2hs\" (UID: \"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126\") " 
pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.466271 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.933528 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563748-8q2hs"] Mar 18 09:08:00 crc kubenswrapper[4778]: W0318 09:08:00.939411 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d4f1c72_13f2_47ff_94aa_9b4e91f2e126.slice/crio-85aa2ca3e5023179d42775d87caa49a9a68b58d0ff03b6202e5dd6a676230129 WatchSource:0}: Error finding container 85aa2ca3e5023179d42775d87caa49a9a68b58d0ff03b6202e5dd6a676230129: Status 404 returned error can't find the container with id 85aa2ca3e5023179d42775d87caa49a9a68b58d0ff03b6202e5dd6a676230129 Mar 18 09:08:01 crc kubenswrapper[4778]: I0318 09:08:01.484353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" event={"ID":"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126","Type":"ContainerStarted","Data":"85aa2ca3e5023179d42775d87caa49a9a68b58d0ff03b6202e5dd6a676230129"} Mar 18 09:08:02 crc kubenswrapper[4778]: I0318 09:08:02.502524 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" event={"ID":"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126","Type":"ContainerStarted","Data":"8bac8cffd4e1a60ba2bf0f1dd076bc9102bdfd1bb1f8e85a389ddde5e582bd3e"} Mar 18 09:08:02 crc kubenswrapper[4778]: I0318 09:08:02.523137 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" podStartSLOduration=1.472313908 podStartE2EDuration="2.5231159s" podCreationTimestamp="2026-03-18 09:08:00 +0000 UTC" firstStartedPulling="2026-03-18 09:08:00.942108182 +0000 UTC 
m=+347.516853022" lastFinishedPulling="2026-03-18 09:08:01.992910144 +0000 UTC m=+348.567655014" observedRunningTime="2026-03-18 09:08:02.521178158 +0000 UTC m=+349.095922998" watchObservedRunningTime="2026-03-18 09:08:02.5231159 +0000 UTC m=+349.097860760" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.523152 4778 generic.go:334] "Generic (PLEG): container finished" podID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" containerID="8bac8cffd4e1a60ba2bf0f1dd076bc9102bdfd1bb1f8e85a389ddde5e582bd3e" exitCode=0 Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.523220 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" event={"ID":"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126","Type":"ContainerDied","Data":"8bac8cffd4e1a60ba2bf0f1dd076bc9102bdfd1bb1f8e85a389ddde5e582bd3e"} Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.590879 4778 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.592312 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.631115 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.689851 4778 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.690151 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf" gracePeriod=15 Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.690321 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8" gracePeriod=15 Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.690369 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5" gracePeriod=15 Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.690442 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b" gracePeriod=15 Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 
09:08:03.690427 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e" gracePeriod=15 Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.693626 4778 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.693995 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694425 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694448 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694464 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694478 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694491 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694511 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694525 4778 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694582 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694599 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694618 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694632 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694649 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694662 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694682 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694695 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694715 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694728 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694905 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694923 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694942 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694960 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694982 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.695002 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.695016 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.695110 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 09:08:03 crc kubenswrapper[4778]: 
E0318 09:08:03.695411 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.695433 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.695606 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.700588 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.700642 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.700690 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.700727 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.700750 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.802678 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.802795 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.802845 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.802927 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.802927 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.802997 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803020 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803078 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803372 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803442 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803584 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803987 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.904619 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.904692 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.904742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.904798 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.904826 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.904832 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.929329 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: W0318 09:08:03.952623 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-fab2138804c88ae433897923ae9e791bf072578ea0fdaf6820e27a9aeaed9215 WatchSource:0}: Error finding container fab2138804c88ae433897923ae9e791bf072578ea0fdaf6820e27a9aeaed9215: Status 404 returned error can't find the container with id fab2138804c88ae433897923ae9e791bf072578ea0fdaf6820e27a9aeaed9215 Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.956722 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.70:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189de45575f14002 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:08:03.955941378 +0000 UTC m=+350.530686228,LastTimestamp:2026-03-18 09:08:03.955941378 +0000 UTC m=+350.530686228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" 
Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.193492 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.193991 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.194350 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.548885 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.551084 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.552773 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8" exitCode=0 Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.552804 
4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b" exitCode=0 Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.552817 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e" exitCode=0 Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.552830 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5" exitCode=2 Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.552883 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.560188 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c87368359c8add64535225d3bbfde067e9b4f8b557eb6da52467546b92f78032"} Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.560251 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fab2138804c88ae433897923ae9e791bf072578ea0fdaf6820e27a9aeaed9215"} Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.561263 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 
09:08:04.562022 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.564818 4778 generic.go:334] "Generic (PLEG): container finished" podID="93723e0d-2243-4390-a667-8a080325205f" containerID="e2b50472335d82d0cbf47f1b30666ae7b50519462f89aadf15ef33cf94804236" exitCode=0 Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.565039 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"93723e0d-2243-4390-a667-8a080325205f","Type":"ContainerDied","Data":"e2b50472335d82d0cbf47f1b30666ae7b50519462f89aadf15ef33cf94804236"} Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.565652 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.566405 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.566960 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.923880 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.924576 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.925107 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.925372 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.020078 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx4kt\" (UniqueName: \"kubernetes.io/projected/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126-kube-api-access-sx4kt\") pod \"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126\" (UID: \"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126\") " Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.029406 
4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126-kube-api-access-sx4kt" (OuterVolumeSpecName: "kube-api-access-sx4kt") pod "5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" (UID: "5d4f1c72-13f2-47ff-94aa-9b4e91f2e126"). InnerVolumeSpecName "kube-api-access-sx4kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.123317 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx4kt\" (UniqueName: \"kubernetes.io/projected/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126-kube-api-access-sx4kt\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.581759 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.585518 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" event={"ID":"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126","Type":"ContainerDied","Data":"85aa2ca3e5023179d42775d87caa49a9a68b58d0ff03b6202e5dd6a676230129"} Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.585568 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85aa2ca3e5023179d42775d87caa49a9a68b58d0ff03b6202e5dd6a676230129" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.585565 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.605892 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.606568 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.607125 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.172472 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.173357 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.173575 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.173796 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.178134 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.178904 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.179225 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.179435 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.179722 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.179954 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.253336 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93723e0d-2243-4390-a667-8a080325205f-kube-api-access\") pod \"93723e0d-2243-4390-a667-8a080325205f\" (UID: 
\"93723e0d-2243-4390-a667-8a080325205f\") " Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.253479 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-kubelet-dir\") pod \"93723e0d-2243-4390-a667-8a080325205f\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.253556 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-var-lock\") pod \"93723e0d-2243-4390-a667-8a080325205f\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.253731 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "93723e0d-2243-4390-a667-8a080325205f" (UID: "93723e0d-2243-4390-a667-8a080325205f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.253815 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-var-lock" (OuterVolumeSpecName: "var-lock") pod "93723e0d-2243-4390-a667-8a080325205f" (UID: "93723e0d-2243-4390-a667-8a080325205f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.261390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93723e0d-2243-4390-a667-8a080325205f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "93723e0d-2243-4390-a667-8a080325205f" (UID: "93723e0d-2243-4390-a667-8a080325205f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.355598 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.355739 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.355783 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.355853 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.355854 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.355870 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.356578 4778 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.356608 4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.356628 4778 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.356645 4778 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.356663 4778 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.356680 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93723e0d-2243-4390-a667-8a080325205f-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.593917 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"93723e0d-2243-4390-a667-8a080325205f","Type":"ContainerDied","Data":"e6f7bb7ae014c0c468ab5d5f0830ec452ff1c1962710492d6af9f507d40acc53"} Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.593999 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6f7bb7ae014c0c468ab5d5f0830ec452ff1c1962710492d6af9f507d40acc53" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.593964 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.599178 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.599914 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf" exitCode=0 Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.599984 4778 scope.go:117] "RemoveContainer" containerID="0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.600177 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.600984 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.601295 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.601793 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.602427 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.621522 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.621785 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.622266 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.623243 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.624392 4778 scope.go:117] "RemoveContainer" containerID="72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.631032 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.631908 4778 status_manager.go:851] 
"Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.632739 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.633183 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.642125 4778 scope.go:117] "RemoveContainer" containerID="e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.661729 4778 scope.go:117] "RemoveContainer" containerID="82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.677659 4778 scope.go:117] "RemoveContainer" containerID="3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.699887 4778 scope.go:117] "RemoveContainer" containerID="7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.731306 4778 scope.go:117] "RemoveContainer" 
containerID="0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8" Mar 18 09:08:06 crc kubenswrapper[4778]: E0318 09:08:06.732020 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\": container with ID starting with 0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8 not found: ID does not exist" containerID="0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.732071 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8"} err="failed to get container status \"0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\": rpc error: code = NotFound desc = could not find container \"0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\": container with ID starting with 0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8 not found: ID does not exist" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.732108 4778 scope.go:117] "RemoveContainer" containerID="72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b" Mar 18 09:08:06 crc kubenswrapper[4778]: E0318 09:08:06.732842 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\": container with ID starting with 72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b not found: ID does not exist" containerID="72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.732913 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b"} err="failed to get container status \"72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\": rpc error: code = NotFound desc = could not find container \"72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\": container with ID starting with 72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b not found: ID does not exist" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.732954 4778 scope.go:117] "RemoveContainer" containerID="e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e" Mar 18 09:08:06 crc kubenswrapper[4778]: E0318 09:08:06.734606 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\": container with ID starting with e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e not found: ID does not exist" containerID="e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.734677 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e"} err="failed to get container status \"e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\": rpc error: code = NotFound desc = could not find container \"e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\": container with ID starting with e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e not found: ID does not exist" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.734724 4778 scope.go:117] "RemoveContainer" containerID="82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5" Mar 18 09:08:06 crc kubenswrapper[4778]: E0318 09:08:06.735291 4778 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\": container with ID starting with 82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5 not found: ID does not exist" containerID="82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.735342 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5"} err="failed to get container status \"82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\": rpc error: code = NotFound desc = could not find container \"82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\": container with ID starting with 82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5 not found: ID does not exist" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.735373 4778 scope.go:117] "RemoveContainer" containerID="3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf" Mar 18 09:08:06 crc kubenswrapper[4778]: E0318 09:08:06.736140 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\": container with ID starting with 3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf not found: ID does not exist" containerID="3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.736241 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf"} err="failed to get container status \"3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\": rpc error: code = NotFound desc = could not find container 
\"3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\": container with ID starting with 3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf not found: ID does not exist" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.736296 4778 scope.go:117] "RemoveContainer" containerID="7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88" Mar 18 09:08:06 crc kubenswrapper[4778]: E0318 09:08:06.736958 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\": container with ID starting with 7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88 not found: ID does not exist" containerID="7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.737011 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88"} err="failed to get container status \"7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\": rpc error: code = NotFound desc = could not find container \"7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\": container with ID starting with 7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88 not found: ID does not exist" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.036608 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.037832 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: 
connect: connection refused" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.038442 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.038849 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.039362 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:08 crc kubenswrapper[4778]: I0318 09:08:08.039417 4778 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.039807 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="200ms" Mar 18 09:08:08 crc kubenswrapper[4778]: I0318 09:08:08.199360 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.241291 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" 
interval="400ms" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.643043 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="800ms" Mar 18 09:08:09 crc kubenswrapper[4778]: E0318 09:08:09.445371 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="1.6s" Mar 18 09:08:10 crc kubenswrapper[4778]: E0318 09:08:10.022679 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.70:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189de45575f14002 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:08:03.955941378 +0000 UTC m=+350.530686228,LastTimestamp:2026-03-18 09:08:03.955941378 +0000 UTC m=+350.530686228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:08:10 crc kubenswrapper[4778]: I0318 09:08:10.354493 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:08:10 crc kubenswrapper[4778]: I0318 09:08:10.355292 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:10 crc kubenswrapper[4778]: I0318 09:08:10.355916 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:10 crc kubenswrapper[4778]: I0318 09:08:10.356499 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:10 crc kubenswrapper[4778]: I0318 09:08:10.357176 4778 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:11 crc kubenswrapper[4778]: E0318 09:08:11.047353 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: 
connect: connection refused" interval="3.2s" Mar 18 09:08:12 crc kubenswrapper[4778]: E0318 09:08:12.263562 4778 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.70:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" volumeName="registry-storage" Mar 18 09:08:14 crc kubenswrapper[4778]: I0318 09:08:14.190103 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:14 crc kubenswrapper[4778]: I0318 09:08:14.190801 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:14 crc kubenswrapper[4778]: I0318 09:08:14.191342 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:14 crc kubenswrapper[4778]: I0318 09:08:14.191821 4778 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:14 crc kubenswrapper[4778]: E0318 09:08:14.249239 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="6.4s" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.186894 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.188391 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.189329 4778 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.189750 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 
18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.190161 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.204685 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.204714 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.205087 4778 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.205585 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.702786 4778 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="15022c61f8f7848b9d5cb82bc7482c0948ac803924a28c75d30573c347fa2a6e" exitCode=0 Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.702910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"15022c61f8f7848b9d5cb82bc7482c0948ac803924a28c75d30573c347fa2a6e"} Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.703164 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"724e531da6d62f3549d123165683968c32317a115c58f5945d2904ceae2a7e48"} Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.703444 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.703457 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.703953 4778 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.704568 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.705457 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.706039 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.706304 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.706742 4778 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.708048 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.708121 4778 generic.go:334] "Generic (PLEG): container 
finished" podID="f614b9022728cf315e60c057852e563e" containerID="c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54" exitCode=1 Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.708162 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54"} Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.708615 4778 scope.go:117] "RemoveContainer" containerID="c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.709286 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.710269 4778 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.710891 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.711493 4778 status_manager.go:851] "Failed to get status for pod" 
podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.711860 4778 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.817241 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:08:18Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:08:18Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:08:18Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:08:18Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 
09:08:18.817627 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.817821 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.818016 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.818316 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.818348 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.956312 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.717845 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.719631 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.719820 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae01dca2780474bc416cdfc4c7ca33c6504d1a3a39842df33f21a753ed00995f"} Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.725829 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0d59ab003f59ea57c623f253dd45a3b041f10ec06eb83b3aca1d3fae860942b9"} Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.726058 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"44161b3eb080fd781ada3e0dece3563642ea32c5314cf590037a6c343e05fe59"} Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.726141 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"aecc03e0c76ad5620aa8cc8df78dd74eca541bf7a499fb18e311a3efc8ed4311"} Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.726226 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2f198513c437443df041b1f482fe066eaa858c0b9ef973208727b6d4f47d9f0d"} Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.801667 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 
09:08:19.802066 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.802224 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 18 09:08:20 crc kubenswrapper[4778]: I0318 09:08:20.733658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e4b8a3499cf6a41fd3046003b4b680c90778652113a61109425c607e0555b37c"} Mar 18 09:08:20 crc kubenswrapper[4778]: I0318 09:08:20.734111 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:20 crc kubenswrapper[4778]: I0318 09:08:20.734137 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:22 crc kubenswrapper[4778]: I0318 09:08:22.724060 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:08:22 crc kubenswrapper[4778]: I0318 09:08:22.727673 4778 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-secret" Mar 18 09:08:22 crc kubenswrapper[4778]: I0318 09:08:22.743982 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:08:23 crc kubenswrapper[4778]: I0318 09:08:23.010926 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 09:08:23 crc kubenswrapper[4778]: I0318 09:08:23.019243 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:08:23 crc kubenswrapper[4778]: I0318 09:08:23.206378 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:23 crc kubenswrapper[4778]: I0318 09:08:23.207076 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:23 crc kubenswrapper[4778]: I0318 09:08:23.212957 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:23 crc kubenswrapper[4778]: W0318 09:08:23.586528 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2d5c312_2314_46d7_8ba2_64b621b0c2c7.slice/crio-503e2ac35b2cb29f79d200576b5f4c19c9e97c546766f821bb1db339e03c2980 WatchSource:0}: Error finding container 503e2ac35b2cb29f79d200576b5f4c19c9e97c546766f821bb1db339e03c2980: Status 404 returned error can't find the container with id 503e2ac35b2cb29f79d200576b5f4c19c9e97c546766f821bb1db339e03c2980 Mar 18 09:08:23 crc kubenswrapper[4778]: I0318 09:08:23.758849 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" event={"ID":"a2d5c312-2314-46d7-8ba2-64b621b0c2c7","Type":"ContainerStarted","Data":"503e2ac35b2cb29f79d200576b5f4c19c9e97c546766f821bb1db339e03c2980"} Mar 18 09:08:24 crc kubenswrapper[4778]: I0318 09:08:24.767962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" event={"ID":"a2d5c312-2314-46d7-8ba2-64b621b0c2c7","Type":"ContainerStarted","Data":"58c1742e2f94b007534f301b9155bdf564198ba49891bcec6ce75b6770dc5c77"} Mar 18 09:08:24 crc kubenswrapper[4778]: I0318 09:08:24.768653 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" event={"ID":"a2d5c312-2314-46d7-8ba2-64b621b0c2c7","Type":"ContainerStarted","Data":"f533ae4bb7169f5404c27fbfc262f2519bafba67043e8afea14c0081c5c0e514"} Mar 18 09:08:25 crc kubenswrapper[4778]: I0318 09:08:25.762101 4778 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:25 crc kubenswrapper[4778]: I0318 09:08:25.935730 4778 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="047c238e-63c8-4a7c-9893-575a3a291f40" Mar 18 09:08:26 crc kubenswrapper[4778]: I0318 09:08:26.778950 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:26 crc kubenswrapper[4778]: I0318 09:08:26.779083 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:26 crc kubenswrapper[4778]: I0318 09:08:26.779115 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:26 crc kubenswrapper[4778]: I0318 
09:08:26.782550 4778 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="047c238e-63c8-4a7c-9893-575a3a291f40" Mar 18 09:08:26 crc kubenswrapper[4778]: I0318 09:08:26.783318 4778 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://2f198513c437443df041b1f482fe066eaa858c0b9ef973208727b6d4f47d9f0d" Mar 18 09:08:26 crc kubenswrapper[4778]: I0318 09:08:26.783345 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:27 crc kubenswrapper[4778]: I0318 09:08:27.786089 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:27 crc kubenswrapper[4778]: I0318 09:08:27.786608 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:27 crc kubenswrapper[4778]: I0318 09:08:27.792845 4778 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="047c238e-63c8-4a7c-9893-575a3a291f40" Mar 18 09:08:28 crc kubenswrapper[4778]: I0318 09:08:28.133860 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:08:28 crc kubenswrapper[4778]: I0318 09:08:28.793264 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:28 crc kubenswrapper[4778]: I0318 09:08:28.793303 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:28 crc kubenswrapper[4778]: I0318 09:08:28.799622 4778 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="047c238e-63c8-4a7c-9893-575a3a291f40" Mar 18 09:08:29 crc kubenswrapper[4778]: I0318 09:08:29.807922 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:08:29 crc kubenswrapper[4778]: I0318 09:08:29.815027 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:08:36 crc kubenswrapper[4778]: I0318 09:08:36.147482 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 09:08:36 crc kubenswrapper[4778]: I0318 09:08:36.279095 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 09:08:36 crc kubenswrapper[4778]: I0318 09:08:36.390090 4778 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 09:08:36 crc kubenswrapper[4778]: I0318 09:08:36.549248 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 09:08:36 crc kubenswrapper[4778]: I0318 09:08:36.747945 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 09:08:36 crc kubenswrapper[4778]: I0318 09:08:36.759352 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 09:08:37 crc kubenswrapper[4778]: I0318 09:08:37.095961 4778 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 09:08:37 crc kubenswrapper[4778]: I0318 09:08:37.107531 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 09:08:37 crc kubenswrapper[4778]: I0318 09:08:37.405834 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 09:08:37 crc kubenswrapper[4778]: I0318 09:08:37.723029 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 09:08:37 crc kubenswrapper[4778]: I0318 09:08:37.793748 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 09:08:37 crc kubenswrapper[4778]: I0318 09:08:37.812374 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.077914 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.086092 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.132185 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.144462 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.160541 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.190149 4778 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.309470 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.381792 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.591005 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.610173 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.743346 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.847270 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.890535 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.897159 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.939859 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.964678 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 
09:08:39.077254 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.222973 4778 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.226392 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.226365902 podStartE2EDuration="36.226365902s" podCreationTimestamp="2026-03-18 09:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:08:25.811572615 +0000 UTC m=+372.386317465" watchObservedRunningTime="2026-03-18 09:08:39.226365902 +0000 UTC m=+385.801110782" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.229638 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9bc7s" podStartSLOduration=315.22962009 podStartE2EDuration="5m15.22962009s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:08:25.907273192 +0000 UTC m=+372.482018052" watchObservedRunningTime="2026-03-18 09:08:39.22962009 +0000 UTC m=+385.804364970" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.231892 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.231972 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.232022 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9bc7s"] Mar 18 09:08:39 crc kubenswrapper[4778]: 
I0318 09:08:39.235963 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.252001 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.255300 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.276336 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.276301607 podStartE2EDuration="14.276301607s" podCreationTimestamp="2026-03-18 09:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:08:39.266931343 +0000 UTC m=+385.841676193" watchObservedRunningTime="2026-03-18 09:08:39.276301607 +0000 UTC m=+385.851046497" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.409424 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.524144 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.636653 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.643589 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.657755 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.745515 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.767032 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.836962 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.881880 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.920957 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.965772 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.183154 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.319125 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.326981 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.387689 4778 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.428089 
4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.492432 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.545630 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.559059 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.586945 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.597723 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.598618 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.622871 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.712092 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.733471 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.788766 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 09:08:40 crc 
kubenswrapper[4778]: I0318 09:08:40.811155 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.906773 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.011127 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.047722 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.112265 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.125038 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.127790 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.213991 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.440661 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.665692 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.747412 4778 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.750644 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.782050 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.782347 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.796191 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.851758 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.867684 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.935822 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.984923 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.015778 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.027409 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 09:08:42 crc kubenswrapper[4778]: 
I0318 09:08:42.118179 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.174812 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.193377 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.213872 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.261646 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.359289 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.404245 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.413658 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.449550 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.474868 4778 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.498272 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 09:08:42 crc 
kubenswrapper[4778]: I0318 09:08:42.662653 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.677086 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.713825 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.714188 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.776336 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.917139 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.929972 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.083334 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.092776 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.177412 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.207093 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"config" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.241804 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.335733 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.338864 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.396615 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.426449 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.465098 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.491952 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.528540 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.583522 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.604260 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 
09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.660102 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.738930 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.739918 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.758946 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.805690 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.891003 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.029661 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.051145 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.053045 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.059860 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.062659 4778 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.101968 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.122803 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.316653 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.341896 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.368468 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.385101 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.597112 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.603891 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.609064 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.642755 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 09:08:44 crc kubenswrapper[4778]: 
I0318 09:08:44.648167 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.665790 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.732937 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.750065 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.750315 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.949107 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.980969 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.017046 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.031341 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.031548 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.107757 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 09:08:45 crc 
kubenswrapper[4778]: I0318 09:08:45.110712 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.157711 4778 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.177018 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.274178 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.284186 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.368273 4778 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.410828 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.476777 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.571413 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.580081 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.602441 4778 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.621539 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.698884 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.743316 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.789412 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.824424 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.931762 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.020474 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.156391 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.162790 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.211501 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.264750 4778 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.283381 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.298499 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.439007 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.459839 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.461460 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.465933 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.479287 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.699139 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.705144 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.793408 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 
09:08:46.828103 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.862651 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.874606 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.902558 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.043586 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.046510 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.105273 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.134693 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.177447 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.177881 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.189374 4778 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.227697 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.250662 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.259563 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.281009 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.327756 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.399997 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.534747 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.546055 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.709808 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.759771 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.860046 4778 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.881659 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.904263 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.994278 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.067486 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.211231 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.242045 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.245759 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.280152 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.331830 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.400347 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 
09:08:48.473396 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.476892 4778 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.477257 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://c87368359c8add64535225d3bbfde067e9b4f8b557eb6da52467546b92f78032" gracePeriod=5 Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.608627 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.660090 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.674384 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.682685 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.709180 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.820537 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.841143 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.961505 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.041244 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.134645 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.248876 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.275903 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.420794 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.430000 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.549483 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.622915 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.748638 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 09:08:49 
crc kubenswrapper[4778]: I0318 09:08:49.832859 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.873416 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.946551 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.058553 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.137153 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.206160 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.380055 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.630812 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.732746 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.858347 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.882971 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.948608 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.056326 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.063443 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.124734 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.207148 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.236345 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.259639 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.351941 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.393275 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.742828 4778 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.943975 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 09:08:52 crc kubenswrapper[4778]: I0318 09:08:52.547402 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 09:08:52 crc kubenswrapper[4778]: I0318 09:08:52.636500 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 09:08:53 crc kubenswrapper[4778]: I0318 09:08:53.974944 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 09:08:53 crc kubenswrapper[4778]: I0318 09:08:53.975351 4778 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="c87368359c8add64535225d3bbfde067e9b4f8b557eb6da52467546b92f78032" exitCode=137 Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.051423 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.051536 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.158692 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.195462 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.207354 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.207392 4778 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8c190e84-de5c-4ee5-9016-8b1ef240b359" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.211220 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.211273 4778 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8c190e84-de5c-4ee5-9016-8b1ef240b359" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.212893 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213328 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213426 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213471 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213476 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213574 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213617 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213735 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.214492 4778 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.214529 4778 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.214547 4778 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.215237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.227179 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.315105 4778 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.315144 4778 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.984406 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.984845 4778 scope.go:117] "RemoveContainer" containerID="c87368359c8add64535225d3bbfde067e9b4f8b557eb6da52467546b92f78032" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.984993 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:56 crc kubenswrapper[4778]: I0318 09:08:56.198007 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 09:09:07 crc kubenswrapper[4778]: I0318 09:09:07.062876 4778 generic.go:334] "Generic (PLEG): container finished" podID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerID="90afc0123cf376859383c4f842d0587910af2413023c90f78391ff8bd752a9d6" exitCode=0 Mar 18 09:09:07 crc kubenswrapper[4778]: I0318 09:09:07.062983 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" event={"ID":"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a","Type":"ContainerDied","Data":"90afc0123cf376859383c4f842d0587910af2413023c90f78391ff8bd752a9d6"} Mar 18 09:09:07 crc kubenswrapper[4778]: I0318 09:09:07.064405 4778 scope.go:117] "RemoveContainer" containerID="90afc0123cf376859383c4f842d0587910af2413023c90f78391ff8bd752a9d6" Mar 18 09:09:08 crc kubenswrapper[4778]: I0318 09:09:08.070935 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" event={"ID":"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a","Type":"ContainerStarted","Data":"f259658053fd1d484054ae2d51122a4e3408bb84a6c07568bf45f82cee5346ab"} Mar 18 09:09:08 crc kubenswrapper[4778]: I0318 09:09:08.071527 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:09:08 crc kubenswrapper[4778]: I0318 09:09:08.073874 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.141046 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563750-lv4gn"] Mar 18 09:10:00 crc kubenswrapper[4778]: E0318 09:10:00.141791 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93723e0d-2243-4390-a667-8a080325205f" containerName="installer" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.141802 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="93723e0d-2243-4390-a667-8a080325205f" containerName="installer" Mar 18 09:10:00 crc kubenswrapper[4778]: E0318 09:10:00.141814 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" containerName="oc" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.141819 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" containerName="oc" Mar 18 09:10:00 crc kubenswrapper[4778]: E0318 09:10:00.141827 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.141832 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.142009 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" containerName="oc" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.142020 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="93723e0d-2243-4390-a667-8a080325205f" containerName="installer" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.142028 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.142516 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.144727 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.147117 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.147229 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.147403 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.149418 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.152266 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563750-lv4gn"] Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.252065 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4btw\" (UniqueName: \"kubernetes.io/projected/105b6b5d-09f6-48c8-862e-c17526c6d6c7-kube-api-access-v4btw\") pod \"auto-csr-approver-29563750-lv4gn\" (UID: \"105b6b5d-09f6-48c8-862e-c17526c6d6c7\") " pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:00 crc 
kubenswrapper[4778]: I0318 09:10:00.353829 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4btw\" (UniqueName: \"kubernetes.io/projected/105b6b5d-09f6-48c8-862e-c17526c6d6c7-kube-api-access-v4btw\") pod \"auto-csr-approver-29563750-lv4gn\" (UID: \"105b6b5d-09f6-48c8-862e-c17526c6d6c7\") " pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.385852 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4btw\" (UniqueName: \"kubernetes.io/projected/105b6b5d-09f6-48c8-862e-c17526c6d6c7-kube-api-access-v4btw\") pod \"auto-csr-approver-29563750-lv4gn\" (UID: \"105b6b5d-09f6-48c8-862e-c17526c6d6c7\") " pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.465493 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.720756 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563750-lv4gn"] Mar 18 09:10:01 crc kubenswrapper[4778]: I0318 09:10:01.446901 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" event={"ID":"105b6b5d-09f6-48c8-862e-c17526c6d6c7","Type":"ContainerStarted","Data":"9e7bc36695299640fb2dfac7b49d965a4df1526527c348b335d6ad804b13fb9c"} Mar 18 09:10:02 crc kubenswrapper[4778]: I0318 09:10:02.457356 4778 generic.go:334] "Generic (PLEG): container finished" podID="105b6b5d-09f6-48c8-862e-c17526c6d6c7" containerID="c5bd546fb47bde264ad4459aced4ba49381ccd5bb127c64ac227483b8bb621c0" exitCode=0 Mar 18 09:10:02 crc kubenswrapper[4778]: I0318 09:10:02.457662 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" 
event={"ID":"105b6b5d-09f6-48c8-862e-c17526c6d6c7","Type":"ContainerDied","Data":"c5bd546fb47bde264ad4459aced4ba49381ccd5bb127c64ac227483b8bb621c0"} Mar 18 09:10:03 crc kubenswrapper[4778]: I0318 09:10:03.818175 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:03 crc kubenswrapper[4778]: I0318 09:10:03.911185 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4btw\" (UniqueName: \"kubernetes.io/projected/105b6b5d-09f6-48c8-862e-c17526c6d6c7-kube-api-access-v4btw\") pod \"105b6b5d-09f6-48c8-862e-c17526c6d6c7\" (UID: \"105b6b5d-09f6-48c8-862e-c17526c6d6c7\") " Mar 18 09:10:03 crc kubenswrapper[4778]: I0318 09:10:03.917421 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/105b6b5d-09f6-48c8-862e-c17526c6d6c7-kube-api-access-v4btw" (OuterVolumeSpecName: "kube-api-access-v4btw") pod "105b6b5d-09f6-48c8-862e-c17526c6d6c7" (UID: "105b6b5d-09f6-48c8-862e-c17526c6d6c7"). InnerVolumeSpecName "kube-api-access-v4btw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:10:04 crc kubenswrapper[4778]: I0318 09:10:04.013047 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4btw\" (UniqueName: \"kubernetes.io/projected/105b6b5d-09f6-48c8-862e-c17526c6d6c7-kube-api-access-v4btw\") on node \"crc\" DevicePath \"\"" Mar 18 09:10:04 crc kubenswrapper[4778]: I0318 09:10:04.475594 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" event={"ID":"105b6b5d-09f6-48c8-862e-c17526c6d6c7","Type":"ContainerDied","Data":"9e7bc36695299640fb2dfac7b49d965a4df1526527c348b335d6ad804b13fb9c"} Mar 18 09:10:04 crc kubenswrapper[4778]: I0318 09:10:04.475658 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e7bc36695299640fb2dfac7b49d965a4df1526527c348b335d6ad804b13fb9c" Mar 18 09:10:04 crc kubenswrapper[4778]: I0318 09:10:04.475697 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:04 crc kubenswrapper[4778]: I0318 09:10:04.885306 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563744-btdt7"] Mar 18 09:10:04 crc kubenswrapper[4778]: I0318 09:10:04.890589 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563744-btdt7"] Mar 18 09:10:06 crc kubenswrapper[4778]: I0318 09:10:06.200349 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54961f10-93b0-433f-8a7d-b30d69178e9a" path="/var/lib/kubelet/pods/54961f10-93b0-433f-8a7d-b30d69178e9a/volumes" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.725421 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8ch6j"] Mar 18 09:10:21 crc kubenswrapper[4778]: E0318 09:10:21.726256 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="105b6b5d-09f6-48c8-862e-c17526c6d6c7" containerName="oc" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.726275 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="105b6b5d-09f6-48c8-862e-c17526c6d6c7" containerName="oc" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.726411 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="105b6b5d-09f6-48c8-862e-c17526c6d6c7" containerName="oc" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.726880 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.758043 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8ch6j"] Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852141 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg2qq\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-kube-api-access-qg2qq\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852264 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb3770db-238d-457b-ab2b-9fe59806cad4-trusted-ca\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852315 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-registry-tls\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: 
\"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852343 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb3770db-238d-457b-ab2b-9fe59806cad4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852414 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb3770db-238d-457b-ab2b-9fe59806cad4-registry-certificates\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852453 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-bound-sa-token\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852494 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852549 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb3770db-238d-457b-ab2b-9fe59806cad4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.892057 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954540 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb3770db-238d-457b-ab2b-9fe59806cad4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954635 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg2qq\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-kube-api-access-qg2qq\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954685 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb3770db-238d-457b-ab2b-9fe59806cad4-trusted-ca\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb3770db-238d-457b-ab2b-9fe59806cad4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954772 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-registry-tls\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954802 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb3770db-238d-457b-ab2b-9fe59806cad4-registry-certificates\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-bound-sa-token\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.956307 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb3770db-238d-457b-ab2b-9fe59806cad4-registry-certificates\") pod 
\"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.956409 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb3770db-238d-457b-ab2b-9fe59806cad4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.957357 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb3770db-238d-457b-ab2b-9fe59806cad4-trusted-ca\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.964607 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-registry-tls\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.969436 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb3770db-238d-457b-ab2b-9fe59806cad4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.972883 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-bound-sa-token\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.986764 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg2qq\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-kube-api-access-qg2qq\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:22 crc kubenswrapper[4778]: I0318 09:10:22.044125 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:22 crc kubenswrapper[4778]: I0318 09:10:22.338799 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8ch6j"] Mar 18 09:10:22 crc kubenswrapper[4778]: I0318 09:10:22.602353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" event={"ID":"bb3770db-238d-457b-ab2b-9fe59806cad4","Type":"ContainerStarted","Data":"7228a1578022fcc6eb924c542ac5792e8e9613cb0ce1327231c26d91fbec5c6d"} Mar 18 09:10:22 crc kubenswrapper[4778]: I0318 09:10:22.602402 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" event={"ID":"bb3770db-238d-457b-ab2b-9fe59806cad4","Type":"ContainerStarted","Data":"06acf5e1c2ce5e33d1c9970d9b6d4eb76f806908244a902017dbb58971091fe9"} Mar 18 09:10:22 crc kubenswrapper[4778]: I0318 09:10:22.603952 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.419700 4778 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" podStartSLOduration=3.419670734 podStartE2EDuration="3.419670734s" podCreationTimestamp="2026-03-18 09:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:10:22.62098434 +0000 UTC m=+489.195729190" watchObservedRunningTime="2026-03-18 09:10:24.419670734 +0000 UTC m=+490.994415614" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.424262 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qvn4w"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.425034 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qvn4w" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="registry-server" containerID="cri-o://70cbb8df67b66047dc2936fa97099c75389be5914cb70774619b80a3c1ca3b41" gracePeriod=30 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.451821 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbbtb"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.452146 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tbbtb" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="registry-server" containerID="cri-o://6d51e3ef30717618dbfd60d2700ddf309051ceaf03fe8bc0561397808fdc4760" gracePeriod=30 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.459031 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hr48"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.459441 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" 
podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator" containerID="cri-o://f259658053fd1d484054ae2d51122a4e3408bb84a6c07568bf45f82cee5346ab" gracePeriod=30 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.475618 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qgm2"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.476659 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6qgm2" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="registry-server" containerID="cri-o://35627094815d8aab35b2d89ade00f4db12d2982f44d18065d3e2b7a8cff620a7" gracePeriod=30 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.484667 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6kvnk"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.485224 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6kvnk" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="registry-server" containerID="cri-o://079c4231aac10f3612c674a07d9729d7a0fd2d54be2669f250d2d9ca937b5439" gracePeriod=30 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.490260 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jj774"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.491997 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jj774" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.494812 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jj774"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.602336 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7sb4\" (UniqueName: \"kubernetes.io/projected/e037e8cd-1543-49a8-9389-4cc6f440c4b3-kube-api-access-x7sb4\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.602435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e037e8cd-1543-49a8-9389-4cc6f440c4b3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.603056 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e037e8cd-1543-49a8-9389-4cc6f440c4b3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.640327 4778 generic.go:334] "Generic (PLEG): container finished" podID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerID="70cbb8df67b66047dc2936fa97099c75389be5914cb70774619b80a3c1ca3b41" exitCode=0 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.640398 4778 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerDied","Data":"70cbb8df67b66047dc2936fa97099c75389be5914cb70774619b80a3c1ca3b41"} Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.643664 4778 generic.go:334] "Generic (PLEG): container finished" podID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerID="35627094815d8aab35b2d89ade00f4db12d2982f44d18065d3e2b7a8cff620a7" exitCode=0 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.643709 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qgm2" event={"ID":"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47","Type":"ContainerDied","Data":"35627094815d8aab35b2d89ade00f4db12d2982f44d18065d3e2b7a8cff620a7"} Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.645987 4778 generic.go:334] "Generic (PLEG): container finished" podID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerID="f259658053fd1d484054ae2d51122a4e3408bb84a6c07568bf45f82cee5346ab" exitCode=0 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.646043 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" event={"ID":"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a","Type":"ContainerDied","Data":"f259658053fd1d484054ae2d51122a4e3408bb84a6c07568bf45f82cee5346ab"} Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.646076 4778 scope.go:117] "RemoveContainer" containerID="90afc0123cf376859383c4f842d0587910af2413023c90f78391ff8bd752a9d6" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.648988 4778 generic.go:334] "Generic (PLEG): container finished" podID="938982a6-57b0-4870-abed-a98c42196ae6" containerID="079c4231aac10f3612c674a07d9729d7a0fd2d54be2669f250d2d9ca937b5439" exitCode=0 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.649035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" 
event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerDied","Data":"079c4231aac10f3612c674a07d9729d7a0fd2d54be2669f250d2d9ca937b5439"}
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.651013 4778 generic.go:334] "Generic (PLEG): container finished" podID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerID="6d51e3ef30717618dbfd60d2700ddf309051ceaf03fe8bc0561397808fdc4760" exitCode=0
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.651739 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerDied","Data":"6d51e3ef30717618dbfd60d2700ddf309051ceaf03fe8bc0561397808fdc4760"}
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.704033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e037e8cd-1543-49a8-9389-4cc6f440c4b3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.704140 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7sb4\" (UniqueName: \"kubernetes.io/projected/e037e8cd-1543-49a8-9389-4cc6f440c4b3-kube-api-access-x7sb4\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.704193 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e037e8cd-1543-49a8-9389-4cc6f440c4b3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.705539 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e037e8cd-1543-49a8-9389-4cc6f440c4b3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.716949 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e037e8cd-1543-49a8-9389-4cc6f440c4b3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.724079 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7sb4\" (UniqueName: \"kubernetes.io/projected/e037e8cd-1543-49a8-9389-4cc6f440c4b3-kube-api-access-x7sb4\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.928409 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.938324 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qvn4w"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.942775 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.946564 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qgm2"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.952777 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kvnk"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.954182 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.108836 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-operator-metrics\") pod \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.108882 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-utilities\") pod \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.108925 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp6w4\" (UniqueName: \"kubernetes.io/projected/938982a6-57b0-4870-abed-a98c42196ae6-kube-api-access-fp6w4\") pod \"938982a6-57b0-4870-abed-a98c42196ae6\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.108981 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf8tc\" (UniqueName: \"kubernetes.io/projected/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-kube-api-access-qf8tc\") pod \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109009 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-catalog-content\") pod \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109033 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-utilities\") pod \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109056 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcvjk\" (UniqueName: \"kubernetes.io/projected/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-kube-api-access-xcvjk\") pod \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109078 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-catalog-content\") pod \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109124 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjz8f\" (UniqueName: \"kubernetes.io/projected/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-kube-api-access-vjz8f\") pod \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109144 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca\") pod \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109173 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-catalog-content\") pod \"938982a6-57b0-4870-abed-a98c42196ae6\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109213 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-utilities\") pod \"938982a6-57b0-4870-abed-a98c42196ae6\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109241 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-catalog-content\") pod \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109261 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6c42\" (UniqueName: \"kubernetes.io/projected/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-kube-api-access-p6c42\") pod \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109468 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-utilities\") pod \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.113134 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-utilities" (OuterVolumeSpecName: "utilities") pod "57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" (UID: "57f9c6f6-c20e-4e28-aec4-f0104ddb2b47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.116487 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-kube-api-access-p6c42" (OuterVolumeSpecName: "kube-api-access-p6c42") pod "b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" (UID: "b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa"). InnerVolumeSpecName "kube-api-access-p6c42". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.116760 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-utilities" (OuterVolumeSpecName: "utilities") pod "938982a6-57b0-4870-abed-a98c42196ae6" (UID: "938982a6-57b0-4870-abed-a98c42196ae6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.116825 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/938982a6-57b0-4870-abed-a98c42196ae6-kube-api-access-fp6w4" (OuterVolumeSpecName: "kube-api-access-fp6w4") pod "938982a6-57b0-4870-abed-a98c42196ae6" (UID: "938982a6-57b0-4870-abed-a98c42196ae6"). InnerVolumeSpecName "kube-api-access-fp6w4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.116897 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-utilities" (OuterVolumeSpecName: "utilities") pod "ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" (UID: "ed8eaf37-d7fe-43d1-8d20-fffdd71748cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.117550 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" (UID: "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.117820 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-utilities" (OuterVolumeSpecName: "utilities") pod "b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" (UID: "b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.118158 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-kube-api-access-vjz8f" (OuterVolumeSpecName: "kube-api-access-vjz8f") pod "ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" (UID: "ed8eaf37-d7fe-43d1-8d20-fffdd71748cc"). InnerVolumeSpecName "kube-api-access-vjz8f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.118406 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-kube-api-access-xcvjk" (OuterVolumeSpecName: "kube-api-access-xcvjk") pod "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" (UID: "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a"). InnerVolumeSpecName "kube-api-access-xcvjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.118648 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" (UID: "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.118849 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-kube-api-access-qf8tc" (OuterVolumeSpecName: "kube-api-access-qf8tc") pod "57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" (UID: "57f9c6f6-c20e-4e28-aec4-f0104ddb2b47"). InnerVolumeSpecName "kube-api-access-qf8tc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.173774 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" (UID: "57f9c6f6-c20e-4e28-aec4-f0104ddb2b47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.194478 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" (UID: "ed8eaf37-d7fe-43d1-8d20-fffdd71748cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.195390 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jj774"]
Mar 18 09:10:25 crc kubenswrapper[4778]: W0318 09:10:25.200123 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode037e8cd_1543_49a8_9389_4cc6f440c4b3.slice/crio-c0527933ca28d30c64d338c93a15706da67519748e40cfb8d668bfc5f52809de WatchSource:0}: Error finding container c0527933ca28d30c64d338c93a15706da67519748e40cfb8d668bfc5f52809de: Status 404 returned error can't find the container with id c0527933ca28d30c64d338c93a15706da67519748e40cfb8d668bfc5f52809de
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212076 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp6w4\" (UniqueName: \"kubernetes.io/projected/938982a6-57b0-4870-abed-a98c42196ae6-kube-api-access-fp6w4\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212117 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf8tc\" (UniqueName: \"kubernetes.io/projected/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-kube-api-access-qf8tc\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212136 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212152 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212165 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcvjk\" (UniqueName: \"kubernetes.io/projected/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-kube-api-access-xcvjk\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212175 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjz8f\" (UniqueName: \"kubernetes.io/projected/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-kube-api-access-vjz8f\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212187 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212217 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212228 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212238 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6c42\" (UniqueName: \"kubernetes.io/projected/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-kube-api-access-p6c42\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212249 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212263 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212275 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.214317 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" (UID: "b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.293523 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "938982a6-57b0-4870-abed-a98c42196ae6" (UID: "938982a6-57b0-4870-abed-a98c42196ae6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.313633 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.313677 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.658881 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerDied","Data":"5c2b4b3fdbc29641b0fd4b628d894c39c32fe66c2451fed6064a04a8d6f0eddd"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.660575 4778 scope.go:117] "RemoveContainer" containerID="70cbb8df67b66047dc2936fa97099c75389be5914cb70774619b80a3c1ca3b41"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.658906 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qvn4w"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.660609 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qgm2" event={"ID":"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47","Type":"ContainerDied","Data":"a8ae2f2fdacacfb271851ed06f03f49aedc9a4cd93513577d22ad1180d7345ef"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.660657 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qgm2"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.676584 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.676907 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" event={"ID":"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a","Type":"ContainerDied","Data":"44fcaa7d9066c5bc322cc3c475c2c95ffa382825c1c11ff0bdf59ba686b15693"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.682648 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jj774" event={"ID":"e037e8cd-1543-49a8-9389-4cc6f440c4b3","Type":"ContainerStarted","Data":"b29fe1dcf245e93feaf5a46740fb296a8a076e359c3e022be5d2b82ef6b2f575"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.682715 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jj774" event={"ID":"e037e8cd-1543-49a8-9389-4cc6f440c4b3","Type":"ContainerStarted","Data":"c0527933ca28d30c64d338c93a15706da67519748e40cfb8d668bfc5f52809de"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.685735 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerDied","Data":"40f7240695ce3bda0e1e4c2a6151b9a381bd8173f92f88c530036fbcfc0a002f"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.685848 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kvnk"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.691425 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerDied","Data":"a612db4db085f0867c6a8718b7e590183f11921029875dedeb88f07468c98457"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.691579 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.710100 4778 scope.go:117] "RemoveContainer" containerID="206c2187bf0a136c6ff49b69bb1bb6cc918dc7e2a3d9bd6a2d8bac6ce3a51e5f"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.739324 4778 scope.go:117] "RemoveContainer" containerID="9e9b1baa8deb4596f595ec2a830346f2addf7d69c909efa6643ba0c90cdd01c7"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.747861 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qvn4w"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.756330 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qvn4w"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.761446 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jj774" podStartSLOduration=1.76141272 podStartE2EDuration="1.76141272s" podCreationTimestamp="2026-03-18 09:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:10:25.745992171 +0000 UTC m=+492.320737001" watchObservedRunningTime="2026-03-18 09:10:25.76141272 +0000 UTC m=+492.336157560"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.765555 4778 scope.go:117] "RemoveContainer" containerID="35627094815d8aab35b2d89ade00f4db12d2982f44d18065d3e2b7a8cff620a7"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.769936 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qgm2"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.778843 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qgm2"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.793055 4778 scope.go:117] "RemoveContainer" containerID="f49cf5ea04db3604b7012853be48f57eabfbbf2919ff145d883ab1c07e04a460"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.814628 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6kvnk"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.817699 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6kvnk"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.818456 4778 scope.go:117] "RemoveContainer" containerID="113dc27ffd2ebd355aaf8e22c8a148444f799a56c796af33fbc9fe643673da94"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.827034 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbbtb"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.833056 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tbbtb"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.837384 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hr48"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.841011 4778 scope.go:117] "RemoveContainer" containerID="f259658053fd1d484054ae2d51122a4e3408bb84a6c07568bf45f82cee5346ab"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.844629 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hr48"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.861161 4778 scope.go:117] "RemoveContainer" containerID="079c4231aac10f3612c674a07d9729d7a0fd2d54be2669f250d2d9ca937b5439"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.877922 4778 scope.go:117] "RemoveContainer" containerID="d3a1ccac938944a3f089f8c92f9aebbf40c8042f08e26dece0839acbb161f095"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.894696 4778 scope.go:117] "RemoveContainer" containerID="8977456d128ab832e4d2b65a1ebbe275173e48c92b3849c579d8a9cc853d0ce8"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.909801 4778 scope.go:117] "RemoveContainer" containerID="6d51e3ef30717618dbfd60d2700ddf309051ceaf03fe8bc0561397808fdc4760"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.922002 4778 scope.go:117] "RemoveContainer" containerID="080f512015bd3cc96010f06c24c5ff7c172a932d2f9c91b8b4e05d0e6fdb8776"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.939611 4778 scope.go:117] "RemoveContainer" containerID="e143a776ed51bb64025b24b3e1cc128e2a2ca67730b9a34f438ed6857f8be065"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.200653 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" path="/var/lib/kubelet/pods/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47/volumes"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.201910 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="938982a6-57b0-4870-abed-a98c42196ae6" path="/var/lib/kubelet/pods/938982a6-57b0-4870-abed-a98c42196ae6/volumes"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.203080 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" path="/var/lib/kubelet/pods/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa/volumes"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.205056 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" path="/var/lib/kubelet/pods/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc/volumes"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.206307 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" path="/var/lib/kubelet/pods/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a/volumes"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.633695 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xs85d"]
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635053 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635081 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635095 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635103 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635111 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635120 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635128 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635134 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635142 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635148 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635156 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635162 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635169 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635176 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635186 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635209 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635220 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635227 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635236 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635243 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635254 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635260 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635272 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635278 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635287 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635293 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635302 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635310 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635417 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635426 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635436 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635445 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635453 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635634 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.636637 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xs85d"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.638736 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.649671 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs85d"]
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.705112 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.708011 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.733849 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn6s8\" (UniqueName: \"kubernetes.io/projected/0eaac9b5-67d6-4187-b118-0add20190689-kube-api-access-vn6s8\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.733995 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eaac9b5-67d6-4187-b118-0add20190689-catalog-content\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.734088 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eaac9b5-67d6-4187-b118-0add20190689-utilities\") pod \"redhat-marketplace-xs85d\" (UID: 
\"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.830870 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9b8p9"] Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.832174 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.835073 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eaac9b5-67d6-4187-b118-0add20190689-utilities\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.835144 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn6s8\" (UniqueName: \"kubernetes.io/projected/0eaac9b5-67d6-4187-b118-0add20190689-kube-api-access-vn6s8\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.835228 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eaac9b5-67d6-4187-b118-0add20190689-catalog-content\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.835370 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.835963 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0eaac9b5-67d6-4187-b118-0add20190689-utilities\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.836219 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eaac9b5-67d6-4187-b118-0add20190689-catalog-content\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.848058 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9b8p9"] Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.879724 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn6s8\" (UniqueName: \"kubernetes.io/projected/0eaac9b5-67d6-4187-b118-0add20190689-kube-api-access-vn6s8\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.936836 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv59s\" (UniqueName: \"kubernetes.io/projected/f9a557a7-2d98-4e56-8119-acfd64357871-kube-api-access-kv59s\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.936912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a557a7-2d98-4e56-8119-acfd64357871-catalog-content\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" 
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.936936 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a557a7-2d98-4e56-8119-acfd64357871-utilities\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.969688 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.038104 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a557a7-2d98-4e56-8119-acfd64357871-catalog-content\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.038163 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a557a7-2d98-4e56-8119-acfd64357871-utilities\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.038250 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv59s\" (UniqueName: \"kubernetes.io/projected/f9a557a7-2d98-4e56-8119-acfd64357871-kube-api-access-kv59s\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.039043 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f9a557a7-2d98-4e56-8119-acfd64357871-utilities\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.040669 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a557a7-2d98-4e56-8119-acfd64357871-catalog-content\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.070246 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv59s\" (UniqueName: \"kubernetes.io/projected/f9a557a7-2d98-4e56-8119-acfd64357871-kube-api-access-kv59s\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.152097 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.201870 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs85d"] Mar 18 09:10:27 crc kubenswrapper[4778]: W0318 09:10:27.212050 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eaac9b5_67d6_4187_b118_0add20190689.slice/crio-7839735294d147cf4ba43aa4d7887579df743a8b71976c9abcde9921692c3035 WatchSource:0}: Error finding container 7839735294d147cf4ba43aa4d7887579df743a8b71976c9abcde9921692c3035: Status 404 returned error can't find the container with id 7839735294d147cf4ba43aa4d7887579df743a8b71976c9abcde9921692c3035 Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.383231 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9b8p9"] Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.724715 4778 generic.go:334] "Generic (PLEG): container finished" podID="0eaac9b5-67d6-4187-b118-0add20190689" containerID="f388982a6381655f0b5c5f2a3da5b6d1b9bef2d39ef797f57925c20b9817472c" exitCode=0 Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.725516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs85d" event={"ID":"0eaac9b5-67d6-4187-b118-0add20190689","Type":"ContainerDied","Data":"f388982a6381655f0b5c5f2a3da5b6d1b9bef2d39ef797f57925c20b9817472c"} Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.725665 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs85d" event={"ID":"0eaac9b5-67d6-4187-b118-0add20190689","Type":"ContainerStarted","Data":"7839735294d147cf4ba43aa4d7887579df743a8b71976c9abcde9921692c3035"} Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.730944 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="f9a557a7-2d98-4e56-8119-acfd64357871" containerID="e29ef92821e2044968655d435177477cc2b9361ff5419d7bccfdd357ea22baa2" exitCode=0 Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.731899 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b8p9" event={"ID":"f9a557a7-2d98-4e56-8119-acfd64357871","Type":"ContainerDied","Data":"e29ef92821e2044968655d435177477cc2b9361ff5419d7bccfdd357ea22baa2"} Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.731934 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b8p9" event={"ID":"f9a557a7-2d98-4e56-8119-acfd64357871","Type":"ContainerStarted","Data":"cb95bef02ed3ea7cd6da9b563de2430ccb225bf3ee2ffbe0ba35413f02b45525"} Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.055161 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-csm2z"] Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.057624 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.069498 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.073456 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-csm2z"] Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.173034 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-catalog-content\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.173093 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-utilities\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.173583 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w6qv\" (UniqueName: \"kubernetes.io/projected/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-kube-api-access-9w6qv\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.241322 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ktcxn"] Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.247099 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.254828 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.263713 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ktcxn"] Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.275849 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-catalog-content\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.275908 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-utilities\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.275979 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w6qv\" (UniqueName: \"kubernetes.io/projected/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-kube-api-access-9w6qv\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.276693 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-catalog-content\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " 
pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.276737 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-utilities\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.310870 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w6qv\" (UniqueName: \"kubernetes.io/projected/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-kube-api-access-9w6qv\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.377995 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-catalog-content\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.378048 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s9q2\" (UniqueName: \"kubernetes.io/projected/fee87709-f8ed-4eb4-829e-1fdb6534bb35-kube-api-access-5s9q2\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.378128 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-utilities\") pod \"community-operators-ktcxn\" (UID: 
\"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.404290 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.479148 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-utilities\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.479243 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-catalog-content\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.479264 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s9q2\" (UniqueName: \"kubernetes.io/projected/fee87709-f8ed-4eb4-829e-1fdb6534bb35-kube-api-access-5s9q2\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.479685 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-utilities\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.479717 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-catalog-content\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.514474 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s9q2\" (UniqueName: \"kubernetes.io/projected/fee87709-f8ed-4eb4-829e-1fdb6534bb35-kube-api-access-5s9q2\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.574165 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.649437 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-csm2z"] Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.745365 4778 generic.go:334] "Generic (PLEG): container finished" podID="0eaac9b5-67d6-4187-b118-0add20190689" containerID="954929f7d2d9273de039af00dbe8ecaba4e10ae2e0ce60b95735a16f0a6ce1d7" exitCode=0 Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.745439 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs85d" event={"ID":"0eaac9b5-67d6-4187-b118-0add20190689","Type":"ContainerDied","Data":"954929f7d2d9273de039af00dbe8ecaba4e10ae2e0ce60b95735a16f0a6ce1d7"} Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.753086 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b8p9" event={"ID":"f9a557a7-2d98-4e56-8119-acfd64357871","Type":"ContainerStarted","Data":"b507290e4000ed980fb3ccd54bd103fde51b3f601acdd8010eaa6ec4edea7112"} Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.757715 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-csm2z" event={"ID":"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3","Type":"ContainerStarted","Data":"1eb15b9c7e341bdbdafffdf98bfb8563afc6aafb0c6d1d7be9705c2b12903958"} Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.760792 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ktcxn"] Mar 18 09:10:29 crc kubenswrapper[4778]: W0318 09:10:29.773093 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee87709_f8ed_4eb4_829e_1fdb6534bb35.slice/crio-1a86626e5d4576d55c9f62a59074b0761782b73c69b053e7829507e68652f471 WatchSource:0}: Error finding container 1a86626e5d4576d55c9f62a59074b0761782b73c69b053e7829507e68652f471: Status 404 returned error can't find the container with id 1a86626e5d4576d55c9f62a59074b0761782b73c69b053e7829507e68652f471 Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.147518 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.149312 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.766759 4778 generic.go:334] "Generic (PLEG): container finished" podID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerID="6ecbe80389c09da7c5dfaf24f572df1adb64cba289f74a3e8339845f8cebe749" exitCode=0 Mar 18 09:10:30 crc 
kubenswrapper[4778]: I0318 09:10:30.766842 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktcxn" event={"ID":"fee87709-f8ed-4eb4-829e-1fdb6534bb35","Type":"ContainerDied","Data":"6ecbe80389c09da7c5dfaf24f572df1adb64cba289f74a3e8339845f8cebe749"} Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.766879 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktcxn" event={"ID":"fee87709-f8ed-4eb4-829e-1fdb6534bb35","Type":"ContainerStarted","Data":"1a86626e5d4576d55c9f62a59074b0761782b73c69b053e7829507e68652f471"} Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.770463 4778 generic.go:334] "Generic (PLEG): container finished" podID="83efc97a-1a91-4bc8-90bf-a78bc8ee90e3" containerID="001f9df6918d35a48d43c771d16934b8d15d4706ddeb0d1de385aab2a15e503e" exitCode=0 Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.770543 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-csm2z" event={"ID":"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3","Type":"ContainerDied","Data":"001f9df6918d35a48d43c771d16934b8d15d4706ddeb0d1de385aab2a15e503e"} Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.773233 4778 generic.go:334] "Generic (PLEG): container finished" podID="f9a557a7-2d98-4e56-8119-acfd64357871" containerID="b507290e4000ed980fb3ccd54bd103fde51b3f601acdd8010eaa6ec4edea7112" exitCode=0 Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.773291 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b8p9" event={"ID":"f9a557a7-2d98-4e56-8119-acfd64357871","Type":"ContainerDied","Data":"b507290e4000ed980fb3ccd54bd103fde51b3f601acdd8010eaa6ec4edea7112"} Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.778536 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs85d" 
event={"ID":"0eaac9b5-67d6-4187-b118-0add20190689","Type":"ContainerStarted","Data":"951eb8ce92ce0a97415382e79f2feb2fe4e10fb2df66c4882bfbffa91d09e171"} Mar 18 09:10:31 crc kubenswrapper[4778]: I0318 09:10:31.789273 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b8p9" event={"ID":"f9a557a7-2d98-4e56-8119-acfd64357871","Type":"ContainerStarted","Data":"98686129a658886ec0480e792b75f96d368f1fb9b7723c5a0396f19c73eb8f4f"} Mar 18 09:10:31 crc kubenswrapper[4778]: I0318 09:10:31.810424 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xs85d" podStartSLOduration=3.374203237 podStartE2EDuration="5.810395449s" podCreationTimestamp="2026-03-18 09:10:26 +0000 UTC" firstStartedPulling="2026-03-18 09:10:27.726737726 +0000 UTC m=+494.301482566" lastFinishedPulling="2026-03-18 09:10:30.162929938 +0000 UTC m=+496.737674778" observedRunningTime="2026-03-18 09:10:30.86581377 +0000 UTC m=+497.440558640" watchObservedRunningTime="2026-03-18 09:10:31.810395449 +0000 UTC m=+498.385140299" Mar 18 09:10:31 crc kubenswrapper[4778]: I0318 09:10:31.812023 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9b8p9" podStartSLOduration=2.154506702 podStartE2EDuration="5.812012673s" podCreationTimestamp="2026-03-18 09:10:26 +0000 UTC" firstStartedPulling="2026-03-18 09:10:27.73317439 +0000 UTC m=+494.307919230" lastFinishedPulling="2026-03-18 09:10:31.390680321 +0000 UTC m=+497.965425201" observedRunningTime="2026-03-18 09:10:31.805475586 +0000 UTC m=+498.380220436" watchObservedRunningTime="2026-03-18 09:10:31.812012673 +0000 UTC m=+498.386757533" Mar 18 09:10:32 crc kubenswrapper[4778]: I0318 09:10:32.797593 4778 generic.go:334] "Generic (PLEG): container finished" podID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerID="d37142aca8df005734457524dffa32c4483716edffbcfb2d1b92b3701d6e7e1c" exitCode=0 Mar 18 
09:10:32 crc kubenswrapper[4778]: I0318 09:10:32.797717 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktcxn" event={"ID":"fee87709-f8ed-4eb4-829e-1fdb6534bb35","Type":"ContainerDied","Data":"d37142aca8df005734457524dffa32c4483716edffbcfb2d1b92b3701d6e7e1c"} Mar 18 09:10:33 crc kubenswrapper[4778]: I0318 09:10:33.809880 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktcxn" event={"ID":"fee87709-f8ed-4eb4-829e-1fdb6534bb35","Type":"ContainerStarted","Data":"0d5497da92a3a6b067e66da6b34d1f0c05c8fc0ce853a92ab83f4966e6f9359e"} Mar 18 09:10:33 crc kubenswrapper[4778]: I0318 09:10:33.813889 4778 generic.go:334] "Generic (PLEG): container finished" podID="83efc97a-1a91-4bc8-90bf-a78bc8ee90e3" containerID="6fe6c8c9a13bac0fd88ed849c73dcfa6711eb3a6b0e4dbc9785b0295e53c5d5b" exitCode=0 Mar 18 09:10:33 crc kubenswrapper[4778]: I0318 09:10:33.813957 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-csm2z" event={"ID":"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3","Type":"ContainerDied","Data":"6fe6c8c9a13bac0fd88ed849c73dcfa6711eb3a6b0e4dbc9785b0295e53c5d5b"} Mar 18 09:10:33 crc kubenswrapper[4778]: I0318 09:10:33.839071 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ktcxn" podStartSLOduration=2.233017312 podStartE2EDuration="4.839042243s" podCreationTimestamp="2026-03-18 09:10:29 +0000 UTC" firstStartedPulling="2026-03-18 09:10:30.769107265 +0000 UTC m=+497.343852105" lastFinishedPulling="2026-03-18 09:10:33.375132176 +0000 UTC m=+499.949877036" observedRunningTime="2026-03-18 09:10:33.836782122 +0000 UTC m=+500.411526972" watchObservedRunningTime="2026-03-18 09:10:33.839042243 +0000 UTC m=+500.413787083" Mar 18 09:10:34 crc kubenswrapper[4778]: I0318 09:10:34.823935 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-csm2z" event={"ID":"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3","Type":"ContainerStarted","Data":"dc3a0df91d3078511e57a266f8dc3cdf826fb116b6d9f3cbca75b54bd4b09de4"} Mar 18 09:10:34 crc kubenswrapper[4778]: I0318 09:10:34.850538 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-csm2z" podStartSLOduration=2.402036369 podStartE2EDuration="5.850509868s" podCreationTimestamp="2026-03-18 09:10:29 +0000 UTC" firstStartedPulling="2026-03-18 09:10:30.77221973 +0000 UTC m=+497.346964570" lastFinishedPulling="2026-03-18 09:10:34.220693209 +0000 UTC m=+500.795438069" observedRunningTime="2026-03-18 09:10:34.846436478 +0000 UTC m=+501.421181338" watchObservedRunningTime="2026-03-18 09:10:34.850509868 +0000 UTC m=+501.425254708" Mar 18 09:10:36 crc kubenswrapper[4778]: I0318 09:10:36.969930 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:36 crc kubenswrapper[4778]: I0318 09:10:36.970416 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:37 crc kubenswrapper[4778]: I0318 09:10:37.033476 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:37 crc kubenswrapper[4778]: I0318 09:10:37.154169 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:37 crc kubenswrapper[4778]: I0318 09:10:37.154268 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:37 crc kubenswrapper[4778]: I0318 09:10:37.916412 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:38 crc 
kubenswrapper[4778]: I0318 09:10:38.200546 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9b8p9" podUID="f9a557a7-2d98-4e56-8119-acfd64357871" containerName="registry-server" probeResult="failure" output=< Mar 18 09:10:38 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:10:38 crc kubenswrapper[4778]: > Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.404667 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.404722 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.459602 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.575456 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.575542 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.630828 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.917364 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.918337 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:42 crc kubenswrapper[4778]: I0318 09:10:42.053533 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:42 crc kubenswrapper[4778]: I0318 09:10:42.146041 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nkbq4"] Mar 18 09:10:47 crc kubenswrapper[4778]: I0318 09:10:47.225257 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:47 crc kubenswrapper[4778]: I0318 09:10:47.280576 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:11:00 crc kubenswrapper[4778]: I0318 09:11:00.147554 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:11:00 crc kubenswrapper[4778]: I0318 09:11:00.148265 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:11:00 crc kubenswrapper[4778]: I0318 09:11:00.148314 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:11:00 crc kubenswrapper[4778]: I0318 09:11:00.149109 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34445970a34bfac9ea81483ad63bd182faceb7cf5f4517903b0d318af28a7e07"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:11:00 crc kubenswrapper[4778]: I0318 09:11:00.149217 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://34445970a34bfac9ea81483ad63bd182faceb7cf5f4517903b0d318af28a7e07" gracePeriod=600 Mar 18 09:11:01 crc kubenswrapper[4778]: I0318 09:11:01.013122 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="34445970a34bfac9ea81483ad63bd182faceb7cf5f4517903b0d318af28a7e07" exitCode=0 Mar 18 09:11:01 crc kubenswrapper[4778]: I0318 09:11:01.013268 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"34445970a34bfac9ea81483ad63bd182faceb7cf5f4517903b0d318af28a7e07"} Mar 18 09:11:01 crc kubenswrapper[4778]: I0318 09:11:01.014250 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"8352df71b83dce7216f889f9357e2f46d95d02444a6764ed6ea4a1748338a832"} Mar 18 09:11:01 crc kubenswrapper[4778]: I0318 09:11:01.014291 4778 scope.go:117] "RemoveContainer" containerID="cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.205678 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" podUID="2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" containerName="registry" containerID="cri-o://a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b" gracePeriod=30 Mar 18 09:11:07 crc 
kubenswrapper[4778]: I0318 09:11:07.610704 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.773891 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-tls\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.774375 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.774490 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-trusted-ca\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.774597 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-certificates\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.774639 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-bound-sa-token\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc 
kubenswrapper[4778]: I0318 09:11:07.774692 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsq4q\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-kube-api-access-fsq4q\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.774742 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-installation-pull-secrets\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.774805 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-ca-trust-extracted\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.777249 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.777739 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.783006 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.784748 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-kube-api-access-fsq4q" (OuterVolumeSpecName: "kube-api-access-fsq4q") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "kube-api-access-fsq4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.785170 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.785474 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.791479 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.799425 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876650 4778 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876695 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876708 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876721 4778 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876736 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsq4q\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-kube-api-access-fsq4q\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876748 4778 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876780 4778 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.075708 4778 generic.go:334] "Generic (PLEG): container finished" podID="2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" containerID="a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b" exitCode=0 Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.075754 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.075770 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" event={"ID":"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad","Type":"ContainerDied","Data":"a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b"} Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.076632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" event={"ID":"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad","Type":"ContainerDied","Data":"c662bfec2aa4a3db7809425b04cf847a1727f28e0c070e4d7298a20004e2533f"} Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.076711 4778 scope.go:117] "RemoveContainer" containerID="a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b" Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.111807 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nkbq4"] Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.117351 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nkbq4"] Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.123488 4778 scope.go:117] "RemoveContainer" containerID="a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b" Mar 18 09:11:08 crc kubenswrapper[4778]: E0318 09:11:08.124342 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b\": container with ID starting with a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b not found: ID does not exist" containerID="a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b" Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.124405 
4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b"} err="failed to get container status \"a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b\": rpc error: code = NotFound desc = could not find container \"a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b\": container with ID starting with a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b not found: ID does not exist" Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.195087 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" path="/var/lib/kubelet/pods/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad/volumes" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.135755 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563752-h72kq"] Mar 18 09:12:00 crc kubenswrapper[4778]: E0318 09:12:00.136581 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" containerName="registry" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.136597 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" containerName="registry" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.136737 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" containerName="registry" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.137359 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.144154 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.144380 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.149599 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.157582 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563752-h72kq"] Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.256353 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5vng\" (UniqueName: \"kubernetes.io/projected/57e614a6-a447-41bc-b7c8-034610af7d59-kube-api-access-p5vng\") pod \"auto-csr-approver-29563752-h72kq\" (UID: \"57e614a6-a447-41bc-b7c8-034610af7d59\") " pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.358295 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5vng\" (UniqueName: \"kubernetes.io/projected/57e614a6-a447-41bc-b7c8-034610af7d59-kube-api-access-p5vng\") pod \"auto-csr-approver-29563752-h72kq\" (UID: \"57e614a6-a447-41bc-b7c8-034610af7d59\") " pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.383603 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5vng\" (UniqueName: \"kubernetes.io/projected/57e614a6-a447-41bc-b7c8-034610af7d59-kube-api-access-p5vng\") pod \"auto-csr-approver-29563752-h72kq\" (UID: \"57e614a6-a447-41bc-b7c8-034610af7d59\") " 
pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.469036 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.720861 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563752-h72kq"] Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.724597 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:12:01 crc kubenswrapper[4778]: I0318 09:12:01.468260 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563752-h72kq" event={"ID":"57e614a6-a447-41bc-b7c8-034610af7d59","Type":"ContainerStarted","Data":"50988f6267519097811ac791abc59f70ec8c4ac956672a4ddaf299b87d47e582"} Mar 18 09:12:02 crc kubenswrapper[4778]: I0318 09:12:02.479027 4778 generic.go:334] "Generic (PLEG): container finished" podID="57e614a6-a447-41bc-b7c8-034610af7d59" containerID="6614d11a5de4463d54d3a021b1144b715f14eddffa1ef95f83bb20fa8f58ca90" exitCode=0 Mar 18 09:12:02 crc kubenswrapper[4778]: I0318 09:12:02.479093 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563752-h72kq" event={"ID":"57e614a6-a447-41bc-b7c8-034610af7d59","Type":"ContainerDied","Data":"6614d11a5de4463d54d3a021b1144b715f14eddffa1ef95f83bb20fa8f58ca90"} Mar 18 09:12:03 crc kubenswrapper[4778]: I0318 09:12:03.805047 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:03 crc kubenswrapper[4778]: I0318 09:12:03.938761 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5vng\" (UniqueName: \"kubernetes.io/projected/57e614a6-a447-41bc-b7c8-034610af7d59-kube-api-access-p5vng\") pod \"57e614a6-a447-41bc-b7c8-034610af7d59\" (UID: \"57e614a6-a447-41bc-b7c8-034610af7d59\") " Mar 18 09:12:03 crc kubenswrapper[4778]: I0318 09:12:03.945497 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e614a6-a447-41bc-b7c8-034610af7d59-kube-api-access-p5vng" (OuterVolumeSpecName: "kube-api-access-p5vng") pod "57e614a6-a447-41bc-b7c8-034610af7d59" (UID: "57e614a6-a447-41bc-b7c8-034610af7d59"). InnerVolumeSpecName "kube-api-access-p5vng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:12:04 crc kubenswrapper[4778]: I0318 09:12:04.040809 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5vng\" (UniqueName: \"kubernetes.io/projected/57e614a6-a447-41bc-b7c8-034610af7d59-kube-api-access-p5vng\") on node \"crc\" DevicePath \"\"" Mar 18 09:12:04 crc kubenswrapper[4778]: I0318 09:12:04.499394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563752-h72kq" event={"ID":"57e614a6-a447-41bc-b7c8-034610af7d59","Type":"ContainerDied","Data":"50988f6267519097811ac791abc59f70ec8c4ac956672a4ddaf299b87d47e582"} Mar 18 09:12:04 crc kubenswrapper[4778]: I0318 09:12:04.499471 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50988f6267519097811ac791abc59f70ec8c4ac956672a4ddaf299b87d47e582" Mar 18 09:12:04 crc kubenswrapper[4778]: I0318 09:12:04.499574 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:04 crc kubenswrapper[4778]: I0318 09:12:04.873615 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563746-b66f7"] Mar 18 09:12:04 crc kubenswrapper[4778]: I0318 09:12:04.879736 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563746-b66f7"] Mar 18 09:12:06 crc kubenswrapper[4778]: I0318 09:12:06.193917 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3be356e-94af-47db-a182-dd8a57024619" path="/var/lib/kubelet/pods/c3be356e-94af-47db-a182-dd8a57024619/volumes" Mar 18 09:13:00 crc kubenswrapper[4778]: I0318 09:13:00.147796 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:13:00 crc kubenswrapper[4778]: I0318 09:13:00.148946 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:13:19 crc kubenswrapper[4778]: I0318 09:13:19.405110 4778 scope.go:117] "RemoveContainer" containerID="8b9319e52264a71946e18a681c32dbd6ffc04e6afcc03a59b8bfa719c7422b7f" Mar 18 09:13:19 crc kubenswrapper[4778]: I0318 09:13:19.433283 4778 scope.go:117] "RemoveContainer" containerID="44d1d0b1ecaf0bd45db18a8ca3c0502c00748ea75b870a51131c12eecf1aa1f8" Mar 18 09:13:30 crc kubenswrapper[4778]: I0318 09:13:30.147977 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:13:30 crc kubenswrapper[4778]: I0318 09:13:30.148587 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.138742 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563754-8p782"] Mar 18 09:14:00 crc kubenswrapper[4778]: E0318 09:14:00.139659 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e614a6-a447-41bc-b7c8-034610af7d59" containerName="oc" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.139676 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e614a6-a447-41bc-b7c8-034610af7d59" containerName="oc" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.139823 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e614a6-a447-41bc-b7c8-034610af7d59" containerName="oc" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.140384 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.148985 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.149095 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.149166 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.149519 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.149547 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.149633 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.150004 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8352df71b83dce7216f889f9357e2f46d95d02444a6764ed6ea4a1748338a832"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.150068 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://8352df71b83dce7216f889f9357e2f46d95d02444a6764ed6ea4a1748338a832" gracePeriod=600 Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.160696 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563754-8p782"] Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.243228 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6mfj\" (UniqueName: \"kubernetes.io/projected/913fd7d5-c271-4918-992c-95e6048faa85-kube-api-access-b6mfj\") pod \"auto-csr-approver-29563754-8p782\" (UID: \"913fd7d5-c271-4918-992c-95e6048faa85\") " pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.330670 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="8352df71b83dce7216f889f9357e2f46d95d02444a6764ed6ea4a1748338a832" exitCode=0 Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.330732 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"8352df71b83dce7216f889f9357e2f46d95d02444a6764ed6ea4a1748338a832"} Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.331279 4778 scope.go:117] "RemoveContainer" containerID="34445970a34bfac9ea81483ad63bd182faceb7cf5f4517903b0d318af28a7e07" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.345220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6mfj\" (UniqueName: 
\"kubernetes.io/projected/913fd7d5-c271-4918-992c-95e6048faa85-kube-api-access-b6mfj\") pod \"auto-csr-approver-29563754-8p782\" (UID: \"913fd7d5-c271-4918-992c-95e6048faa85\") " pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.369413 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6mfj\" (UniqueName: \"kubernetes.io/projected/913fd7d5-c271-4918-992c-95e6048faa85-kube-api-access-b6mfj\") pod \"auto-csr-approver-29563754-8p782\" (UID: \"913fd7d5-c271-4918-992c-95e6048faa85\") " pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.478149 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.915924 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563754-8p782"] Mar 18 09:14:01 crc kubenswrapper[4778]: I0318 09:14:01.341232 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"2bce801303fcc67febe135eadcc45500477ce5c50a4d9315c95b48aad6bc9b1d"} Mar 18 09:14:01 crc kubenswrapper[4778]: I0318 09:14:01.342660 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563754-8p782" event={"ID":"913fd7d5-c271-4918-992c-95e6048faa85","Type":"ContainerStarted","Data":"c4808469037255cf3739a39f462f2935e1242030d6b9156f4361ab4e5be94e70"} Mar 18 09:14:02 crc kubenswrapper[4778]: I0318 09:14:02.354494 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563754-8p782" event={"ID":"913fd7d5-c271-4918-992c-95e6048faa85","Type":"ContainerStarted","Data":"623e10ff390eb7e19703ecca951ddfb9b57c997906e5eae908a2c7aedc17d0d1"} 
Mar 18 09:14:02 crc kubenswrapper[4778]: I0318 09:14:02.377816 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563754-8p782" podStartSLOduration=1.417824869 podStartE2EDuration="2.377785821s" podCreationTimestamp="2026-03-18 09:14:00 +0000 UTC" firstStartedPulling="2026-03-18 09:14:00.92342652 +0000 UTC m=+707.498171360" lastFinishedPulling="2026-03-18 09:14:01.883387442 +0000 UTC m=+708.458132312" observedRunningTime="2026-03-18 09:14:02.375683863 +0000 UTC m=+708.950428763" watchObservedRunningTime="2026-03-18 09:14:02.377785821 +0000 UTC m=+708.952530711" Mar 18 09:14:03 crc kubenswrapper[4778]: I0318 09:14:03.364323 4778 generic.go:334] "Generic (PLEG): container finished" podID="913fd7d5-c271-4918-992c-95e6048faa85" containerID="623e10ff390eb7e19703ecca951ddfb9b57c997906e5eae908a2c7aedc17d0d1" exitCode=0 Mar 18 09:14:03 crc kubenswrapper[4778]: I0318 09:14:03.364391 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563754-8p782" event={"ID":"913fd7d5-c271-4918-992c-95e6048faa85","Type":"ContainerDied","Data":"623e10ff390eb7e19703ecca951ddfb9b57c997906e5eae908a2c7aedc17d0d1"} Mar 18 09:14:04 crc kubenswrapper[4778]: I0318 09:14:04.689255 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:04 crc kubenswrapper[4778]: I0318 09:14:04.740263 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6mfj\" (UniqueName: \"kubernetes.io/projected/913fd7d5-c271-4918-992c-95e6048faa85-kube-api-access-b6mfj\") pod \"913fd7d5-c271-4918-992c-95e6048faa85\" (UID: \"913fd7d5-c271-4918-992c-95e6048faa85\") " Mar 18 09:14:04 crc kubenswrapper[4778]: I0318 09:14:04.747045 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/913fd7d5-c271-4918-992c-95e6048faa85-kube-api-access-b6mfj" (OuterVolumeSpecName: "kube-api-access-b6mfj") pod "913fd7d5-c271-4918-992c-95e6048faa85" (UID: "913fd7d5-c271-4918-992c-95e6048faa85"). InnerVolumeSpecName "kube-api-access-b6mfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:14:04 crc kubenswrapper[4778]: I0318 09:14:04.842973 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6mfj\" (UniqueName: \"kubernetes.io/projected/913fd7d5-c271-4918-992c-95e6048faa85-kube-api-access-b6mfj\") on node \"crc\" DevicePath \"\"" Mar 18 09:14:05 crc kubenswrapper[4778]: I0318 09:14:05.383974 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563754-8p782" event={"ID":"913fd7d5-c271-4918-992c-95e6048faa85","Type":"ContainerDied","Data":"c4808469037255cf3739a39f462f2935e1242030d6b9156f4361ab4e5be94e70"} Mar 18 09:14:05 crc kubenswrapper[4778]: I0318 09:14:05.384035 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:05 crc kubenswrapper[4778]: I0318 09:14:05.384050 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4808469037255cf3739a39f462f2935e1242030d6b9156f4361ab4e5be94e70" Mar 18 09:14:05 crc kubenswrapper[4778]: I0318 09:14:05.447518 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563748-8q2hs"] Mar 18 09:14:05 crc kubenswrapper[4778]: I0318 09:14:05.451087 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563748-8q2hs"] Mar 18 09:14:06 crc kubenswrapper[4778]: I0318 09:14:06.199485 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" path="/var/lib/kubelet/pods/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126/volumes" Mar 18 09:14:19 crc kubenswrapper[4778]: I0318 09:14:19.490390 4778 scope.go:117] "RemoveContainer" containerID="de8929b794bb5d2a7228965e1493c7e14a2590362f85b344f68eb15fc61bd4bf" Mar 18 09:14:19 crc kubenswrapper[4778]: I0318 09:14:19.516583 4778 scope.go:117] "RemoveContainer" containerID="2aaa498f349b23cf7a4f0fb9da41ba553f76ed88636548c40f4f1cf1a8220b22" Mar 18 09:14:19 crc kubenswrapper[4778]: I0318 09:14:19.565571 4778 scope.go:117] "RemoveContainer" containerID="cf4a9ddbe48af9c3f976ba168fa13253c79814734a6e5e0e3ef5fa348e79df80" Mar 18 09:14:19 crc kubenswrapper[4778]: I0318 09:14:19.618825 4778 scope.go:117] "RemoveContainer" containerID="6182c0f78e6f784c954f3d716e1ee189ef539115eb9b295b92f5ba494516c3c6" Mar 18 09:14:19 crc kubenswrapper[4778]: I0318 09:14:19.640945 4778 scope.go:117] "RemoveContainer" containerID="8bac8cffd4e1a60ba2bf0f1dd076bc9102bdfd1bb1f8e85a389ddde5e582bd3e" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.149913 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6"] Mar 18 09:15:00 
crc kubenswrapper[4778]: E0318 09:15:00.150859 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913fd7d5-c271-4918-992c-95e6048faa85" containerName="oc" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.150879 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="913fd7d5-c271-4918-992c-95e6048faa85" containerName="oc" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.151021 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="913fd7d5-c271-4918-992c-95e6048faa85" containerName="oc" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.151546 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.154697 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.154717 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.170981 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6"] Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.212466 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea72845-4b27-4381-b08b-e0570c67bddb-config-volume\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.212568 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n5nc\" (UniqueName: 
\"kubernetes.io/projected/bea72845-4b27-4381-b08b-e0570c67bddb-kube-api-access-8n5nc\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.212621 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea72845-4b27-4381-b08b-e0570c67bddb-secret-volume\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.313738 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea72845-4b27-4381-b08b-e0570c67bddb-secret-volume\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.313809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea72845-4b27-4381-b08b-e0570c67bddb-config-volume\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.313875 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n5nc\" (UniqueName: \"kubernetes.io/projected/bea72845-4b27-4381-b08b-e0570c67bddb-kube-api-access-8n5nc\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc 
kubenswrapper[4778]: I0318 09:15:00.314973 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea72845-4b27-4381-b08b-e0570c67bddb-config-volume\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.325516 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea72845-4b27-4381-b08b-e0570c67bddb-secret-volume\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.347000 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n5nc\" (UniqueName: \"kubernetes.io/projected/bea72845-4b27-4381-b08b-e0570c67bddb-kube-api-access-8n5nc\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.481777 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.732994 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6"] Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.829761 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" event={"ID":"bea72845-4b27-4381-b08b-e0570c67bddb","Type":"ContainerStarted","Data":"1680f1b14b94b34a9719dba11e6814c744db638f9e5b99a93b4ed4014f924dba"} Mar 18 09:15:01 crc kubenswrapper[4778]: I0318 09:15:01.839406 4778 generic.go:334] "Generic (PLEG): container finished" podID="bea72845-4b27-4381-b08b-e0570c67bddb" containerID="eaf004108fc735124a6750b445bf1e3f7676efb1a3da3a71036d9a0909c64710" exitCode=0 Mar 18 09:15:01 crc kubenswrapper[4778]: I0318 09:15:01.839492 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" event={"ID":"bea72845-4b27-4381-b08b-e0570c67bddb","Type":"ContainerDied","Data":"eaf004108fc735124a6750b445bf1e3f7676efb1a3da3a71036d9a0909c64710"} Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.105165 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.253969 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea72845-4b27-4381-b08b-e0570c67bddb-config-volume\") pod \"bea72845-4b27-4381-b08b-e0570c67bddb\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.254326 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n5nc\" (UniqueName: \"kubernetes.io/projected/bea72845-4b27-4381-b08b-e0570c67bddb-kube-api-access-8n5nc\") pod \"bea72845-4b27-4381-b08b-e0570c67bddb\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.254368 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea72845-4b27-4381-b08b-e0570c67bddb-secret-volume\") pod \"bea72845-4b27-4381-b08b-e0570c67bddb\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.255123 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea72845-4b27-4381-b08b-e0570c67bddb-config-volume" (OuterVolumeSpecName: "config-volume") pod "bea72845-4b27-4381-b08b-e0570c67bddb" (UID: "bea72845-4b27-4381-b08b-e0570c67bddb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.259803 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bea72845-4b27-4381-b08b-e0570c67bddb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bea72845-4b27-4381-b08b-e0570c67bddb" (UID: "bea72845-4b27-4381-b08b-e0570c67bddb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.259981 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea72845-4b27-4381-b08b-e0570c67bddb-kube-api-access-8n5nc" (OuterVolumeSpecName: "kube-api-access-8n5nc") pod "bea72845-4b27-4381-b08b-e0570c67bddb" (UID: "bea72845-4b27-4381-b08b-e0570c67bddb"). InnerVolumeSpecName "kube-api-access-8n5nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.355793 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n5nc\" (UniqueName: \"kubernetes.io/projected/bea72845-4b27-4381-b08b-e0570c67bddb-kube-api-access-8n5nc\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.355842 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea72845-4b27-4381-b08b-e0570c67bddb-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.355855 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea72845-4b27-4381-b08b-e0570c67bddb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.864948 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" event={"ID":"bea72845-4b27-4381-b08b-e0570c67bddb","Type":"ContainerDied","Data":"1680f1b14b94b34a9719dba11e6814c744db638f9e5b99a93b4ed4014f924dba"} Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.865003 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1680f1b14b94b34a9719dba11e6814c744db638f9e5b99a93b4ed4014f924dba" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.865055 4778 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.343110 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-khqrg"] Mar 18 09:15:40 crc kubenswrapper[4778]: E0318 09:15:40.344104 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea72845-4b27-4381-b08b-e0570c67bddb" containerName="collect-profiles" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.344122 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea72845-4b27-4381-b08b-e0570c67bddb" containerName="collect-profiles" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.344256 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea72845-4b27-4381-b08b-e0570c67bddb" containerName="collect-profiles" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.344913 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.346642 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-qrqw4"] Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.347277 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.347446 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-qrqw4" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.347728 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-knchm" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.349078 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.350562 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-gsvnx" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.362416 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-khqrg"] Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.376786 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-qrqw4"] Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.381284 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hjskg"] Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.382300 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.387179 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-gnbr4" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.397493 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hjskg"] Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.498579 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6rbf\" (UniqueName: \"kubernetes.io/projected/24a88e8d-e986-4b3d-a77e-1a3e5162ac9c-kube-api-access-p6rbf\") pod \"cert-manager-cainjector-cf98fcc89-khqrg\" (UID: \"24a88e8d-e986-4b3d-a77e-1a3e5162ac9c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.498660 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvlw6\" (UniqueName: \"kubernetes.io/projected/e39be52c-c244-44cc-a707-0ec9994991fa-kube-api-access-wvlw6\") pod \"cert-manager-858654f9db-qrqw4\" (UID: \"e39be52c-c244-44cc-a707-0ec9994991fa\") " pod="cert-manager/cert-manager-858654f9db-qrqw4" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.498704 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v6l6\" (UniqueName: \"kubernetes.io/projected/f09bc4b7-d305-4674-8540-283bd0b4901c-kube-api-access-8v6l6\") pod \"cert-manager-webhook-687f57d79b-hjskg\" (UID: \"f09bc4b7-d305-4674-8540-283bd0b4901c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.599957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6rbf\" (UniqueName: 
\"kubernetes.io/projected/24a88e8d-e986-4b3d-a77e-1a3e5162ac9c-kube-api-access-p6rbf\") pod \"cert-manager-cainjector-cf98fcc89-khqrg\" (UID: \"24a88e8d-e986-4b3d-a77e-1a3e5162ac9c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.600499 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvlw6\" (UniqueName: \"kubernetes.io/projected/e39be52c-c244-44cc-a707-0ec9994991fa-kube-api-access-wvlw6\") pod \"cert-manager-858654f9db-qrqw4\" (UID: \"e39be52c-c244-44cc-a707-0ec9994991fa\") " pod="cert-manager/cert-manager-858654f9db-qrqw4" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.600649 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v6l6\" (UniqueName: \"kubernetes.io/projected/f09bc4b7-d305-4674-8540-283bd0b4901c-kube-api-access-8v6l6\") pod \"cert-manager-webhook-687f57d79b-hjskg\" (UID: \"f09bc4b7-d305-4674-8540-283bd0b4901c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.624162 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v6l6\" (UniqueName: \"kubernetes.io/projected/f09bc4b7-d305-4674-8540-283bd0b4901c-kube-api-access-8v6l6\") pod \"cert-manager-webhook-687f57d79b-hjskg\" (UID: \"f09bc4b7-d305-4674-8540-283bd0b4901c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.624981 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6rbf\" (UniqueName: \"kubernetes.io/projected/24a88e8d-e986-4b3d-a77e-1a3e5162ac9c-kube-api-access-p6rbf\") pod \"cert-manager-cainjector-cf98fcc89-khqrg\" (UID: \"24a88e8d-e986-4b3d-a77e-1a3e5162ac9c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.626060 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvlw6\" (UniqueName: \"kubernetes.io/projected/e39be52c-c244-44cc-a707-0ec9994991fa-kube-api-access-wvlw6\") pod \"cert-manager-858654f9db-qrqw4\" (UID: \"e39be52c-c244-44cc-a707-0ec9994991fa\") " pod="cert-manager/cert-manager-858654f9db-qrqw4" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.676712 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.682425 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-qrqw4" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.703260 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.932545 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-qrqw4"] Mar 18 09:15:41 crc kubenswrapper[4778]: I0318 09:15:41.048690 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hjskg"] Mar 18 09:15:41 crc kubenswrapper[4778]: I0318 09:15:41.117088 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" event={"ID":"f09bc4b7-d305-4674-8540-283bd0b4901c","Type":"ContainerStarted","Data":"2636fa1736c531caefda46d06d12a012800db27e5a6b874ddc7d8e3f9541556d"} Mar 18 09:15:41 crc kubenswrapper[4778]: I0318 09:15:41.118921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-qrqw4" event={"ID":"e39be52c-c244-44cc-a707-0ec9994991fa","Type":"ContainerStarted","Data":"bf6a92a55d75d6ad6258d8f91d078b85e774d97d2928d96666c717f85f7782bf"} Mar 18 09:15:41 crc kubenswrapper[4778]: I0318 09:15:41.208952 4778 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-khqrg"] Mar 18 09:15:41 crc kubenswrapper[4778]: W0318 09:15:41.211838 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24a88e8d_e986_4b3d_a77e_1a3e5162ac9c.slice/crio-0284f6c8d4aafa5f2fd98d8177af50b62207aae5dffd33ca1f450ca5d211f478 WatchSource:0}: Error finding container 0284f6c8d4aafa5f2fd98d8177af50b62207aae5dffd33ca1f450ca5d211f478: Status 404 returned error can't find the container with id 0284f6c8d4aafa5f2fd98d8177af50b62207aae5dffd33ca1f450ca5d211f478 Mar 18 09:15:42 crc kubenswrapper[4778]: I0318 09:15:42.135945 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" event={"ID":"24a88e8d-e986-4b3d-a77e-1a3e5162ac9c","Type":"ContainerStarted","Data":"0284f6c8d4aafa5f2fd98d8177af50b62207aae5dffd33ca1f450ca5d211f478"} Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.167500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-qrqw4" event={"ID":"e39be52c-c244-44cc-a707-0ec9994991fa","Type":"ContainerStarted","Data":"8f9cfe7da34ac42a2298776c6bb8ad1ccbec9bf0c6b0ab3b018a2f639e8bdd94"} Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.171720 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" event={"ID":"24a88e8d-e986-4b3d-a77e-1a3e5162ac9c","Type":"ContainerStarted","Data":"5858b5fbe03770a0084b32004e9ab61fd80c6618dc9a4f60dce661ecdd4cc189"} Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.175488 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" event={"ID":"f09bc4b7-d305-4674-8540-283bd0b4901c","Type":"ContainerStarted","Data":"4579c79beff6826d27512141d9d9cce950a0840faa26c3954db5d83d65971556"} Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.176240 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.199027 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-qrqw4" podStartSLOduration=2.174405188 podStartE2EDuration="6.198996801s" podCreationTimestamp="2026-03-18 09:15:40 +0000 UTC" firstStartedPulling="2026-03-18 09:15:40.948743108 +0000 UTC m=+807.523487948" lastFinishedPulling="2026-03-18 09:15:44.973334701 +0000 UTC m=+811.548079561" observedRunningTime="2026-03-18 09:15:46.189018867 +0000 UTC m=+812.763763767" watchObservedRunningTime="2026-03-18 09:15:46.198996801 +0000 UTC m=+812.773741681" Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.235445 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" podStartSLOduration=3.199536116 podStartE2EDuration="6.23541984s" podCreationTimestamp="2026-03-18 09:15:40 +0000 UTC" firstStartedPulling="2026-03-18 09:15:41.215498675 +0000 UTC m=+807.790243515" lastFinishedPulling="2026-03-18 09:15:44.251382399 +0000 UTC m=+810.826127239" observedRunningTime="2026-03-18 09:15:46.222859885 +0000 UTC m=+812.797604745" watchObservedRunningTime="2026-03-18 09:15:46.23541984 +0000 UTC m=+812.810164710" Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.241763 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" podStartSLOduration=2.381147259 podStartE2EDuration="6.241743774s" podCreationTimestamp="2026-03-18 09:15:40 +0000 UTC" firstStartedPulling="2026-03-18 09:15:41.05600991 +0000 UTC m=+807.630754750" lastFinishedPulling="2026-03-18 09:15:44.916606415 +0000 UTC m=+811.491351265" observedRunningTime="2026-03-18 09:15:46.23943095 +0000 UTC m=+812.814175810" watchObservedRunningTime="2026-03-18 09:15:46.241743774 +0000 UTC 
m=+812.816488654" Mar 18 09:15:50 crc kubenswrapper[4778]: I0318 09:15:50.708881 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.045386 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g2qth"] Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047020 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-controller" containerID="cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047156 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="sbdb" containerID="cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047099 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-node" containerID="cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047260 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-acl-logging" containerID="cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047136 4778 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047153 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="nbdb" containerID="cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047037 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="northd" containerID="cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.106258 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" containerID="cri-o://236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.226885 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/2.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.227506 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/1.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.227611 4778 generic.go:334] "Generic (PLEG): container finished" podID="dce973f3-25e6-4536-87cc-9b46499ad7cf" containerID="f3ed13c08ae99a4b04465da39cc132336b6c856e9f5e19d24c954fe658b51ed0" 
exitCode=2 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.227728 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerDied","Data":"f3ed13c08ae99a4b04465da39cc132336b6c856e9f5e19d24c954fe658b51ed0"} Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.227834 4778 scope.go:117] "RemoveContainer" containerID="3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.228437 4778 scope.go:117] "RemoveContainer" containerID="f3ed13c08ae99a4b04465da39cc132336b6c856e9f5e19d24c954fe658b51ed0" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.228689 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-r2lvf_openshift-multus(dce973f3-25e6-4536-87cc-9b46499ad7cf)\"" pod="openshift-multus/multus-r2lvf" podUID="dce973f3-25e6-4536-87cc-9b46499ad7cf" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.241506 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/3.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.250624 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovn-acl-logging/0.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.251134 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovn-controller/0.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.252863 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" 
containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" exitCode=0 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.252961 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"} Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.253035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"} Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.252988 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" exitCode=0 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.253060 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" exitCode=143 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.253075 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" exitCode=143 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.253095 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"} Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.253104 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" 
event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"} Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.393893 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/3.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.397279 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovn-acl-logging/0.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.398312 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovn-controller/0.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.398834 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475214 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wj8wh"] Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475607 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-node" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475629 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-node" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475650 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475661 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" 
containerName="kube-rbac-proxy-ovn-metrics" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475673 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="northd" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475683 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="northd" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475697 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="sbdb" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475705 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="sbdb" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475715 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475723 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475735 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-acl-logging" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475744 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-acl-logging" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475753 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="nbdb" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475761 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="nbdb" Mar 18 09:15:53 crc 
kubenswrapper[4778]: E0318 09:15:53.475772 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kubecfg-setup" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475783 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kubecfg-setup" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475795 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475802 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475812 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475822 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475833 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475842 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475854 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475862 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-controller" Mar 18 09:15:53 crc 
kubenswrapper[4778]: I0318 09:15:53.476019 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-acl-logging" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476034 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476043 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476051 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476060 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476072 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-node" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476079 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476088 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="nbdb" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476100 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="sbdb" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476112 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" 
containerName="northd" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.476264 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476277 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476415 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476427 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.478918 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502481 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-netns\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502557 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-kubelet\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502599 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod 
"ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502643 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502626 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-netd\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502687 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502738 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-log-socket\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502763 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502791 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-openvswitch\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502826 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-log-socket" (OuterVolumeSpecName: "log-socket") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502847 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-env-overrides\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502859 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502881 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502888 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-systemd-units\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502911 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-node-log\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502932 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-slash\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502969 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovn-node-metrics-cert\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502995 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-var-lib-openvswitch\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503015 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-etc-openvswitch\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503041 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-config\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503079 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-bin\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503101 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-systemd\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503135 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-script-lib\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503160 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8g6f\" (UniqueName: \"kubernetes.io/projected/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-kube-api-access-b8g6f\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc 
kubenswrapper[4778]: I0318 09:15:53.503212 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-ovn-kubernetes\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503308 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-ovn\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503326 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503358 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503376 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503394 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-node-log" (OuterVolumeSpecName: "node-log") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503411 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-slash" (OuterVolumeSpecName: "host-slash") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503810 4778 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503825 4778 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503835 4778 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503843 4778 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-systemd-units\") 
on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503851 4778 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-node-log\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503860 4778 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-slash\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503867 4778 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503876 4778 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503886 4778 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503894 4778 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503902 4778 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-log-socket\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503934 4778 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.504266 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.504295 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.504388 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.504437 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.504932 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.510803 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.515508 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-kube-api-access-b8g6f" (OuterVolumeSpecName: "kube-api-access-b8g6f") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "kube-api-access-b8g6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.529528 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605060 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-node-log\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605139 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-cni-netd\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605191 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovn-node-metrics-cert\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605306 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-log-socket\") pod \"ovnkube-node-wj8wh\" (UID: 
\"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605363 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-ovn\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605454 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605559 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605758 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-var-lib-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605863 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-kubelet\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605945 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5brj\" (UniqueName: \"kubernetes.io/projected/ffcc7085-b304-4ae4-a764-907d0ce857ea-kube-api-access-f5brj\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606177 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovnkube-script-lib\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606249 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-cni-bin\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606307 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovnkube-config\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606333 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-etc-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606369 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606452 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-run-netns\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606499 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-slash\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606536 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-systemd\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606563 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-systemd-units\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606704 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-env-overrides\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606816 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606840 4778 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606853 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606866 4778 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606877 4778 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-systemd\") on node \"crc\" DevicePath \"\"" 
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606888 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606901 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8g6f\" (UniqueName: \"kubernetes.io/projected/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-kube-api-access-b8g6f\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606912 4778 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606927 4778 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.707934 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-systemd-units\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708030 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-env-overrides\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708074 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-node-log\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708119 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-cni-netd\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708132 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-systemd-units\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708159 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovn-node-metrics-cert\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708370 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-log-socket\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708404 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-cni-netd\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708444 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-ovn\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708461 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-node-log\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708518 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-ovn\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708528 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708566 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-log-socket\") pod 
\"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708572 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708606 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708675 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-var-lib-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708731 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-kubelet\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708726 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708760 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-var-lib-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708784 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5brj\" (UniqueName: \"kubernetes.io/projected/ffcc7085-b304-4ae4-a764-907d0ce857ea-kube-api-access-f5brj\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708838 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovnkube-script-lib\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708868 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-kubelet\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708894 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-cni-bin\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708946 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-cni-bin\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.709817 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovnkube-config\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.709883 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-etc-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.709961 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.709545 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-env-overrides\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: 
I0318 09:15:53.710012 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710028 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-run-netns\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovnkube-script-lib\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.709968 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-etc-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710086 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-slash\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710138 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-slash\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710157 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-run-netns\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710231 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-systemd\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710143 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-systemd\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710977 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovnkube-config\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.714181 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovn-node-metrics-cert\") pod 
\"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.740048 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5brj\" (UniqueName: \"kubernetes.io/projected/ffcc7085-b304-4ae4-a764-907d0ce857ea-kube-api-access-f5brj\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.801240 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: W0318 09:15:53.837794 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffcc7085_b304_4ae4_a764_907d0ce857ea.slice/crio-f5e3196f4e1df6024633bf365c8a62435b4fd2f1fa37d7df196ed30f481ed35c WatchSource:0}: Error finding container f5e3196f4e1df6024633bf365c8a62435b4fd2f1fa37d7df196ed30f481ed35c: Status 404 returned error can't find the container with id f5e3196f4e1df6024633bf365c8a62435b4fd2f1fa37d7df196ed30f481ed35c Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.260921 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/2.log" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.264786 4778 generic.go:334] "Generic (PLEG): container finished" podID="ffcc7085-b304-4ae4-a764-907d0ce857ea" containerID="d21377864ded5d0a1002ab2279c8c423e8c8e6ade6bb88e057bc60a56e81ccbd" exitCode=0 Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.264861 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" 
event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerDied","Data":"d21377864ded5d0a1002ab2279c8c423e8c8e6ade6bb88e057bc60a56e81ccbd"} Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.265143 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"f5e3196f4e1df6024633bf365c8a62435b4fd2f1fa37d7df196ed30f481ed35c"} Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.267975 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/3.log" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.272912 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovn-acl-logging/0.log" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274120 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovn-controller/0.log" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274520 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" exitCode=0 Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274544 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" exitCode=0 Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274553 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" exitCode=0 Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274563 4778 
generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" exitCode=0 Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274586 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"} Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274618 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"} Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"} Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274645 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"} Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"0662ffa0aa3c3fc99a955a63b995acc6a492d7cf6b911968c3e8039e26cdcb7f"} Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274677 4778 scope.go:117] "RemoveContainer" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 
09:15:54.274854 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.302305 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.330291 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g2qth"] Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.333937 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g2qth"] Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.353097 4778 scope.go:117] "RemoveContainer" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.397621 4778 scope.go:117] "RemoveContainer" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.435086 4778 scope.go:117] "RemoveContainer" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.465432 4778 scope.go:117] "RemoveContainer" containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.491254 4778 scope.go:117] "RemoveContainer" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.522375 4778 scope.go:117] "RemoveContainer" containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.544939 4778 scope.go:117] "RemoveContainer" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.567325 4778 scope.go:117] 
"RemoveContainer" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.593498 4778 scope.go:117] "RemoveContainer" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.595104 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": container with ID starting with 236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb not found: ID does not exist" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.595143 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"} err="failed to get container status \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": rpc error: code = NotFound desc = could not find container \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": container with ID starting with 236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.595172 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.595854 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": container with ID starting with 5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d not found: ID does not exist" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:15:54 crc 
kubenswrapper[4778]: I0318 09:15:54.595884 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"} err="failed to get container status \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": rpc error: code = NotFound desc = could not find container \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": container with ID starting with 5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.595903 4778 scope.go:117] "RemoveContainer" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.597185 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": container with ID starting with 561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f not found: ID does not exist" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.597242 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"} err="failed to get container status \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": rpc error: code = NotFound desc = could not find container \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": container with ID starting with 561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.597261 4778 scope.go:117] "RemoveContainer" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" Mar 18 
09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.597842 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": container with ID starting with 031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147 not found: ID does not exist" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.597894 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"} err="failed to get container status \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": rpc error: code = NotFound desc = could not find container \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": container with ID starting with 031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.597920 4778 scope.go:117] "RemoveContainer" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.598397 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": container with ID starting with bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3 not found: ID does not exist" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.598435 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"} err="failed to get container status 
\"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": rpc error: code = NotFound desc = could not find container \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": container with ID starting with bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.598455 4778 scope.go:117] "RemoveContainer" containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.598949 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": container with ID starting with 94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b not found: ID does not exist" containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.598991 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"} err="failed to get container status \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": rpc error: code = NotFound desc = could not find container \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": container with ID starting with 94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.599018 4778 scope.go:117] "RemoveContainer" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.599439 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": container with ID starting with 42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5 not found: ID does not exist" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.599472 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"} err="failed to get container status \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": rpc error: code = NotFound desc = could not find container \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": container with ID starting with 42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.599491 4778 scope.go:117] "RemoveContainer" containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.600051 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": container with ID starting with 8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0 not found: ID does not exist" containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.600091 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"} err="failed to get container status \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": rpc error: code = NotFound desc = could not find container \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": container with ID 
starting with 8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.600115 4778 scope.go:117] "RemoveContainer" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.600591 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": container with ID starting with 04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f not found: ID does not exist" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.600645 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"} err="failed to get container status \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": rpc error: code = NotFound desc = could not find container \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": container with ID starting with 04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.600683 4778 scope.go:117] "RemoveContainer" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.601388 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": container with ID starting with 5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107 not found: ID does not exist" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" Mar 18 
09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.601484 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107"} err="failed to get container status \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": rpc error: code = NotFound desc = could not find container \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": container with ID starting with 5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.601533 4778 scope.go:117] "RemoveContainer" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.603804 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"} err="failed to get container status \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": rpc error: code = NotFound desc = could not find container \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": container with ID starting with 236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.603835 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.604369 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"} err="failed to get container status \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": rpc error: code = NotFound desc = could not find container 
\"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": container with ID starting with 5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.604397 4778 scope.go:117] "RemoveContainer" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.604811 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"} err="failed to get container status \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": rpc error: code = NotFound desc = could not find container \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": container with ID starting with 561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.604851 4778 scope.go:117] "RemoveContainer" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.606542 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"} err="failed to get container status \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": rpc error: code = NotFound desc = could not find container \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": container with ID starting with 031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.606571 4778 scope.go:117] "RemoveContainer" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.606926 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"} err="failed to get container status \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": rpc error: code = NotFound desc = could not find container \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": container with ID starting with bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.606951 4778 scope.go:117] "RemoveContainer" containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.607245 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"} err="failed to get container status \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": rpc error: code = NotFound desc = could not find container \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": container with ID starting with 94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.607278 4778 scope.go:117] "RemoveContainer" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.607704 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"} err="failed to get container status \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": rpc error: code = NotFound desc = could not find container \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": container with ID starting with 
42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.607742 4778 scope.go:117] "RemoveContainer" containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.608019 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"} err="failed to get container status \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": rpc error: code = NotFound desc = could not find container \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": container with ID starting with 8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.608049 4778 scope.go:117] "RemoveContainer" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.608663 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"} err="failed to get container status \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": rpc error: code = NotFound desc = could not find container \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": container with ID starting with 04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.608690 4778 scope.go:117] "RemoveContainer" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.609542 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107"} err="failed to get container status \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": rpc error: code = NotFound desc = could not find container \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": container with ID starting with 5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.609569 4778 scope.go:117] "RemoveContainer" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.610045 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"} err="failed to get container status \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": rpc error: code = NotFound desc = could not find container \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": container with ID starting with 236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.610077 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.610347 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"} err="failed to get container status \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": rpc error: code = NotFound desc = could not find container \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": container with ID starting with 5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d not found: ID does not 
exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.610372 4778 scope.go:117] "RemoveContainer" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.610775 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"} err="failed to get container status \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": rpc error: code = NotFound desc = could not find container \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": container with ID starting with 561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.610801 4778 scope.go:117] "RemoveContainer" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.611164 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"} err="failed to get container status \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": rpc error: code = NotFound desc = could not find container \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": container with ID starting with 031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.611206 4778 scope.go:117] "RemoveContainer" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.611739 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"} err="failed to get container status 
\"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": rpc error: code = NotFound desc = could not find container \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": container with ID starting with bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.611765 4778 scope.go:117] "RemoveContainer" containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.612069 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"} err="failed to get container status \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": rpc error: code = NotFound desc = could not find container \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": container with ID starting with 94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.612111 4778 scope.go:117] "RemoveContainer" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.612552 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"} err="failed to get container status \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": rpc error: code = NotFound desc = could not find container \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": container with ID starting with 42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.612583 4778 scope.go:117] "RemoveContainer" 
containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.613152 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"} err="failed to get container status \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": rpc error: code = NotFound desc = could not find container \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": container with ID starting with 8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.613184 4778 scope.go:117] "RemoveContainer" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.614059 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"} err="failed to get container status \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": rpc error: code = NotFound desc = could not find container \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": container with ID starting with 04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.614105 4778 scope.go:117] "RemoveContainer" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.614537 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107"} err="failed to get container status \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": rpc error: code = NotFound desc = could 
not find container \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": container with ID starting with 5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.614562 4778 scope.go:117] "RemoveContainer" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.614957 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"} err="failed to get container status \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": rpc error: code = NotFound desc = could not find container \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": container with ID starting with 236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.614980 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.616995 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"} err="failed to get container status \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": rpc error: code = NotFound desc = could not find container \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": container with ID starting with 5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.617024 4778 scope.go:117] "RemoveContainer" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 
09:15:54.618527 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"} err="failed to get container status \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": rpc error: code = NotFound desc = could not find container \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": container with ID starting with 561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.618618 4778 scope.go:117] "RemoveContainer" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.619296 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"} err="failed to get container status \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": rpc error: code = NotFound desc = could not find container \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": container with ID starting with 031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.619326 4778 scope.go:117] "RemoveContainer" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.620171 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"} err="failed to get container status \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": rpc error: code = NotFound desc = could not find container \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": container with ID starting with 
bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.620402 4778 scope.go:117] "RemoveContainer" containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.620789 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"} err="failed to get container status \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": rpc error: code = NotFound desc = could not find container \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": container with ID starting with 94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.620833 4778 scope.go:117] "RemoveContainer" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.621267 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"} err="failed to get container status \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": rpc error: code = NotFound desc = could not find container \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": container with ID starting with 42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.621298 4778 scope.go:117] "RemoveContainer" containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.621609 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"} err="failed to get container status \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": rpc error: code = NotFound desc = could not find container \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": container with ID starting with 8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.621735 4778 scope.go:117] "RemoveContainer" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.622211 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"} err="failed to get container status \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": rpc error: code = NotFound desc = could not find container \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": container with ID starting with 04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.622242 4778 scope.go:117] "RemoveContainer" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.622661 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107"} err="failed to get container status \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": rpc error: code = NotFound desc = could not find container \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": container with ID starting with 5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107 not found: ID does not 
exist" Mar 18 09:15:55 crc kubenswrapper[4778]: I0318 09:15:55.289753 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"bb9f0f2dbf78111ae61c32607460c52a5289f21c493f1d11b9f0136de240ace7"} Mar 18 09:15:55 crc kubenswrapper[4778]: I0318 09:15:55.290148 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"b2762b2a8a9cf60abc3e1a47b5fcff9c1ac332c758dc5db6f836932298dc703e"} Mar 18 09:15:55 crc kubenswrapper[4778]: I0318 09:15:55.290162 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"e88829a78b81baf658a9f6aa474c13edfa36d9293af979a5af3163ee7894111f"} Mar 18 09:15:55 crc kubenswrapper[4778]: I0318 09:15:55.290229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"97fde475db8479dc5f6dd9b3f54e772b9f4851994e2501e2cd66c560c2e98472"} Mar 18 09:15:55 crc kubenswrapper[4778]: I0318 09:15:55.290242 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"9aae53f4ee819d01522f1b867eefa3722b585d8ed0122d111c1ba18a2f29665b"} Mar 18 09:15:55 crc kubenswrapper[4778]: I0318 09:15:55.290254 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"bfaca16b71999384966203a18082f0cb7de1868a00c74057228bd44e3414e847"} Mar 18 09:15:56 crc kubenswrapper[4778]: I0318 09:15:56.199864 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" path="/var/lib/kubelet/pods/ef97d63e-1caf-44c9-ac0c-9b03dbd05113/volumes" Mar 18 09:15:58 crc kubenswrapper[4778]: I0318 09:15:58.317049 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"6df40edb7a46800fdfba44f52a7229fa9eb0ec956d0deec9facb75e649abfd37"} Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.137582 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563756-s8bkt"] Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.138867 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.143632 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.143826 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.143996 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.147598 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.147693 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.216513 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lmf7\" (UniqueName: \"kubernetes.io/projected/a6298370-ed2e-4705-827b-c1a77b03f32a-kube-api-access-5lmf7\") pod \"auto-csr-approver-29563756-s8bkt\" (UID: \"a6298370-ed2e-4705-827b-c1a77b03f32a\") " pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.317790 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lmf7\" (UniqueName: \"kubernetes.io/projected/a6298370-ed2e-4705-827b-c1a77b03f32a-kube-api-access-5lmf7\") pod \"auto-csr-approver-29563756-s8bkt\" (UID: \"a6298370-ed2e-4705-827b-c1a77b03f32a\") " pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.339229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"f75ad8a40abb1f4d6e6a8fbaf0774fc78e6261a9216a78d79ac21cd4ac817e46"} Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.339677 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.339701 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.341624 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lmf7\" (UniqueName: \"kubernetes.io/projected/a6298370-ed2e-4705-827b-c1a77b03f32a-kube-api-access-5lmf7\") pod 
\"auto-csr-approver-29563756-s8bkt\" (UID: \"a6298370-ed2e-4705-827b-c1a77b03f32a\") " pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.369966 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.376295 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" podStartSLOduration=7.376278503 podStartE2EDuration="7.376278503s" podCreationTimestamp="2026-03-18 09:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:16:00.372507609 +0000 UTC m=+826.947252459" watchObservedRunningTime="2026-03-18 09:16:00.376278503 +0000 UTC m=+826.951023343" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.458391 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: E0318 09:16:00.487407 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(06f5228725be3d4a74527b764f3454a7538859a5ace127e774f22fdc891659f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:16:00 crc kubenswrapper[4778]: E0318 09:16:00.487510 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(06f5228725be3d4a74527b764f3454a7538859a5ace127e774f22fdc891659f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: E0318 09:16:00.487538 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(06f5228725be3d4a74527b764f3454a7538859a5ace127e774f22fdc891659f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: E0318 09:16:00.487606 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29563756-s8bkt_openshift-infra(a6298370-ed2e-4705-827b-c1a77b03f32a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29563756-s8bkt_openshift-infra(a6298370-ed2e-4705-827b-c1a77b03f32a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(06f5228725be3d4a74527b764f3454a7538859a5ace127e774f22fdc891659f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.572564 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563756-s8bkt"] Mar 18 09:16:01 crc kubenswrapper[4778]: I0318 09:16:01.344479 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:01 crc kubenswrapper[4778]: I0318 09:16:01.345019 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:01 crc kubenswrapper[4778]: I0318 09:16:01.345148 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:16:01 crc kubenswrapper[4778]: E0318 09:16:01.377753 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(00819e6d6edcaf45089b4400e517bd0e65957d461a51981cccb703db10967cb5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:16:01 crc kubenswrapper[4778]: E0318 09:16:01.377841 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(00819e6d6edcaf45089b4400e517bd0e65957d461a51981cccb703db10967cb5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:01 crc kubenswrapper[4778]: E0318 09:16:01.377879 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(00819e6d6edcaf45089b4400e517bd0e65957d461a51981cccb703db10967cb5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:01 crc kubenswrapper[4778]: E0318 09:16:01.377940 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29563756-s8bkt_openshift-infra(a6298370-ed2e-4705-827b-c1a77b03f32a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29563756-s8bkt_openshift-infra(a6298370-ed2e-4705-827b-c1a77b03f32a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(00819e6d6edcaf45089b4400e517bd0e65957d461a51981cccb703db10967cb5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" Mar 18 09:16:01 crc kubenswrapper[4778]: I0318 09:16:01.396112 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:16:04 crc kubenswrapper[4778]: I0318 09:16:04.192780 4778 scope.go:117] "RemoveContainer" containerID="f3ed13c08ae99a4b04465da39cc132336b6c856e9f5e19d24c954fe658b51ed0" Mar 18 09:16:04 crc kubenswrapper[4778]: E0318 09:16:04.193601 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-r2lvf_openshift-multus(dce973f3-25e6-4536-87cc-9b46499ad7cf)\"" pod="openshift-multus/multus-r2lvf" podUID="dce973f3-25e6-4536-87cc-9b46499ad7cf" Mar 18 09:16:12 crc kubenswrapper[4778]: I0318 09:16:12.187128 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:12 crc kubenswrapper[4778]: I0318 09:16:12.188589 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:12 crc kubenswrapper[4778]: E0318 09:16:12.243443 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(5546cbe4db08d0c0e9da6321547de08fd0ab86f621eae69fd3a3dd13b707a887): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:16:12 crc kubenswrapper[4778]: E0318 09:16:12.243568 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(5546cbe4db08d0c0e9da6321547de08fd0ab86f621eae69fd3a3dd13b707a887): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:12 crc kubenswrapper[4778]: E0318 09:16:12.243610 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(5546cbe4db08d0c0e9da6321547de08fd0ab86f621eae69fd3a3dd13b707a887): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:12 crc kubenswrapper[4778]: E0318 09:16:12.243694 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29563756-s8bkt_openshift-infra(a6298370-ed2e-4705-827b-c1a77b03f32a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29563756-s8bkt_openshift-infra(a6298370-ed2e-4705-827b-c1a77b03f32a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(5546cbe4db08d0c0e9da6321547de08fd0ab86f621eae69fd3a3dd13b707a887): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" Mar 18 09:16:17 crc kubenswrapper[4778]: I0318 09:16:17.187089 4778 scope.go:117] "RemoveContainer" containerID="f3ed13c08ae99a4b04465da39cc132336b6c856e9f5e19d24c954fe658b51ed0" Mar 18 09:16:17 crc kubenswrapper[4778]: I0318 09:16:17.463939 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/2.log" Mar 18 09:16:17 crc kubenswrapper[4778]: I0318 09:16:17.464326 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerStarted","Data":"9bfe3dd19d78b636a423d27ec05bd54a506754859e25a7865f4f2ed5351ab160"} Mar 18 09:16:23 crc kubenswrapper[4778]: I0318 09:16:23.838537 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:16:24 crc kubenswrapper[4778]: I0318 09:16:24.187003 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:24 crc kubenswrapper[4778]: I0318 09:16:24.191176 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:24 crc kubenswrapper[4778]: I0318 09:16:24.499766 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563756-s8bkt"] Mar 18 09:16:24 crc kubenswrapper[4778]: I0318 09:16:24.520849 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" event={"ID":"a6298370-ed2e-4705-827b-c1a77b03f32a","Type":"ContainerStarted","Data":"0e8064a1cd9a776938fe866aee052971920f428d338552bf9922173d39cb35da"} Mar 18 09:16:26 crc kubenswrapper[4778]: I0318 09:16:26.537126 4778 generic.go:334] "Generic (PLEG): container finished" podID="a6298370-ed2e-4705-827b-c1a77b03f32a" containerID="4909b98cff116d6eb4c151d4ba3b46f1a567c070f760a907d7a4e8ea4dca9196" exitCode=0 Mar 18 09:16:26 crc kubenswrapper[4778]: I0318 09:16:26.537328 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" event={"ID":"a6298370-ed2e-4705-827b-c1a77b03f32a","Type":"ContainerDied","Data":"4909b98cff116d6eb4c151d4ba3b46f1a567c070f760a907d7a4e8ea4dca9196"} Mar 18 09:16:27 crc kubenswrapper[4778]: I0318 09:16:27.864418 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.022159 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lmf7\" (UniqueName: \"kubernetes.io/projected/a6298370-ed2e-4705-827b-c1a77b03f32a-kube-api-access-5lmf7\") pod \"a6298370-ed2e-4705-827b-c1a77b03f32a\" (UID: \"a6298370-ed2e-4705-827b-c1a77b03f32a\") " Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.041100 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6298370-ed2e-4705-827b-c1a77b03f32a-kube-api-access-5lmf7" (OuterVolumeSpecName: "kube-api-access-5lmf7") pod "a6298370-ed2e-4705-827b-c1a77b03f32a" (UID: "a6298370-ed2e-4705-827b-c1a77b03f32a"). InnerVolumeSpecName "kube-api-access-5lmf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.124384 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lmf7\" (UniqueName: \"kubernetes.io/projected/a6298370-ed2e-4705-827b-c1a77b03f32a-kube-api-access-5lmf7\") on node \"crc\" DevicePath \"\"" Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.553279 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" event={"ID":"a6298370-ed2e-4705-827b-c1a77b03f32a","Type":"ContainerDied","Data":"0e8064a1cd9a776938fe866aee052971920f428d338552bf9922173d39cb35da"} Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.553333 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e8064a1cd9a776938fe866aee052971920f428d338552bf9922173d39cb35da" Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.553404 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.941063 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563750-lv4gn"] Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.944287 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563750-lv4gn"] Mar 18 09:16:30 crc kubenswrapper[4778]: I0318 09:16:30.147709 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:16:30 crc kubenswrapper[4778]: I0318 09:16:30.147797 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:16:30 crc kubenswrapper[4778]: I0318 09:16:30.201074 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="105b6b5d-09f6-48c8-862e-c17526c6d6c7" path="/var/lib/kubelet/pods/105b6b5d-09f6-48c8-862e-c17526c6d6c7/volumes" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.326104 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49"] Mar 18 09:16:34 crc kubenswrapper[4778]: E0318 09:16:34.326990 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" containerName="oc" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.327005 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" 
containerName="oc" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.327123 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" containerName="oc" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.328187 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.330580 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.338923 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49"] Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.408860 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.408969 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q4fc\" (UniqueName: \"kubernetes.io/projected/85a942ea-cebf-408c-95b8-f435630b20ad-kube-api-access-9q4fc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.409008 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.510240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q4fc\" (UniqueName: \"kubernetes.io/projected/85a942ea-cebf-408c-95b8-f435630b20ad-kube-api-access-9q4fc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.510551 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.510599 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.511279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: 
\"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.511279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.531444 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q4fc\" (UniqueName: \"kubernetes.io/projected/85a942ea-cebf-408c-95b8-f435630b20ad-kube-api-access-9q4fc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.685469 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.908156 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49"] Mar 18 09:16:35 crc kubenswrapper[4778]: I0318 09:16:35.603877 4778 generic.go:334] "Generic (PLEG): container finished" podID="85a942ea-cebf-408c-95b8-f435630b20ad" containerID="fb110c6531d7f01f64ebf0f6e90cfe932435a67f8179032a89235f6875dd767f" exitCode=0 Mar 18 09:16:35 crc kubenswrapper[4778]: I0318 09:16:35.603962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" event={"ID":"85a942ea-cebf-408c-95b8-f435630b20ad","Type":"ContainerDied","Data":"fb110c6531d7f01f64ebf0f6e90cfe932435a67f8179032a89235f6875dd767f"} Mar 18 09:16:35 crc kubenswrapper[4778]: I0318 09:16:35.604048 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" event={"ID":"85a942ea-cebf-408c-95b8-f435630b20ad","Type":"ContainerStarted","Data":"9f04029b414c3cb7688ae4afc622f10363ab38f8bb293083d74ffa8e39919aeb"} Mar 18 09:16:37 crc kubenswrapper[4778]: I0318 09:16:37.621778 4778 generic.go:334] "Generic (PLEG): container finished" podID="85a942ea-cebf-408c-95b8-f435630b20ad" containerID="d4b5149511ad14259bdd54f43fd9a3428baa75e9a79cb380d5bd48c26a00b834" exitCode=0 Mar 18 09:16:37 crc kubenswrapper[4778]: I0318 09:16:37.621855 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" event={"ID":"85a942ea-cebf-408c-95b8-f435630b20ad","Type":"ContainerDied","Data":"d4b5149511ad14259bdd54f43fd9a3428baa75e9a79cb380d5bd48c26a00b834"} Mar 18 09:16:38 crc kubenswrapper[4778]: I0318 09:16:38.632410 4778 
generic.go:334] "Generic (PLEG): container finished" podID="85a942ea-cebf-408c-95b8-f435630b20ad" containerID="368bbf8e766766ba5c0481c8ff55994d3b55ac4a07f2ee40883fbee6779c77a0" exitCode=0 Mar 18 09:16:38 crc kubenswrapper[4778]: I0318 09:16:38.632473 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" event={"ID":"85a942ea-cebf-408c-95b8-f435630b20ad","Type":"ContainerDied","Data":"368bbf8e766766ba5c0481c8ff55994d3b55ac4a07f2ee40883fbee6779c77a0"} Mar 18 09:16:39 crc kubenswrapper[4778]: I0318 09:16:39.865037 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:39 crc kubenswrapper[4778]: I0318 09:16:39.997522 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-bundle\") pod \"85a942ea-cebf-408c-95b8-f435630b20ad\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " Mar 18 09:16:39 crc kubenswrapper[4778]: I0318 09:16:39.997728 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-util\") pod \"85a942ea-cebf-408c-95b8-f435630b20ad\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " Mar 18 09:16:39 crc kubenswrapper[4778]: I0318 09:16:39.997842 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q4fc\" (UniqueName: \"kubernetes.io/projected/85a942ea-cebf-408c-95b8-f435630b20ad-kube-api-access-9q4fc\") pod \"85a942ea-cebf-408c-95b8-f435630b20ad\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " Mar 18 09:16:39 crc kubenswrapper[4778]: I0318 09:16:39.999451 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-bundle" (OuterVolumeSpecName: "bundle") pod "85a942ea-cebf-408c-95b8-f435630b20ad" (UID: "85a942ea-cebf-408c-95b8-f435630b20ad"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.007691 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a942ea-cebf-408c-95b8-f435630b20ad-kube-api-access-9q4fc" (OuterVolumeSpecName: "kube-api-access-9q4fc") pod "85a942ea-cebf-408c-95b8-f435630b20ad" (UID: "85a942ea-cebf-408c-95b8-f435630b20ad"). InnerVolumeSpecName "kube-api-access-9q4fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.018763 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-util" (OuterVolumeSpecName: "util") pod "85a942ea-cebf-408c-95b8-f435630b20ad" (UID: "85a942ea-cebf-408c-95b8-f435630b20ad"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.099386 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.099439 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-util\") on node \"crc\" DevicePath \"\"" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.099457 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q4fc\" (UniqueName: \"kubernetes.io/projected/85a942ea-cebf-408c-95b8-f435630b20ad-kube-api-access-9q4fc\") on node \"crc\" DevicePath \"\"" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.648980 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" event={"ID":"85a942ea-cebf-408c-95b8-f435630b20ad","Type":"ContainerDied","Data":"9f04029b414c3cb7688ae4afc622f10363ab38f8bb293083d74ffa8e39919aeb"} Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.649041 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f04029b414c3cb7688ae4afc622f10363ab38f8bb293083d74ffa8e39919aeb" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.649116 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.977550 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls"] Mar 18 09:16:41 crc kubenswrapper[4778]: E0318 09:16:41.978403 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="util" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.978422 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="util" Mar 18 09:16:41 crc kubenswrapper[4778]: E0318 09:16:41.978439 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="pull" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.978448 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="pull" Mar 18 09:16:41 crc kubenswrapper[4778]: E0318 09:16:41.978468 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="extract" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.978475 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="extract" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.978583 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="extract" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.979241 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.981937 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.982983 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.983044 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-n2vhk" Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.001792 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls"] Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.127693 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cz7p\" (UniqueName: \"kubernetes.io/projected/1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe-kube-api-access-2cz7p\") pod \"nmstate-operator-796d4cfff4-sr9ls\" (UID: \"1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.228581 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cz7p\" (UniqueName: \"kubernetes.io/projected/1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe-kube-api-access-2cz7p\") pod \"nmstate-operator-796d4cfff4-sr9ls\" (UID: \"1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.252565 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cz7p\" (UniqueName: \"kubernetes.io/projected/1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe-kube-api-access-2cz7p\") pod \"nmstate-operator-796d4cfff4-sr9ls\" (UID: 
\"1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.294121 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.537831 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls"] Mar 18 09:16:42 crc kubenswrapper[4778]: W0318 09:16:42.548508 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dd7ccb2_0dca_4a6d_87f7_195b0ae0f9fe.slice/crio-45060fd7497241cce12ea1e0c6b83efe4318493b630a7487083e5183a64e385d WatchSource:0}: Error finding container 45060fd7497241cce12ea1e0c6b83efe4318493b630a7487083e5183a64e385d: Status 404 returned error can't find the container with id 45060fd7497241cce12ea1e0c6b83efe4318493b630a7487083e5183a64e385d Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.660731 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" event={"ID":"1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe","Type":"ContainerStarted","Data":"45060fd7497241cce12ea1e0c6b83efe4318493b630a7487083e5183a64e385d"} Mar 18 09:16:43 crc kubenswrapper[4778]: I0318 09:16:43.047571 4778 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 09:16:54 crc kubenswrapper[4778]: I0318 09:16:54.741462 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" event={"ID":"1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe","Type":"ContainerStarted","Data":"272b1b4d0558016cc87acf481ff47c38931f2048212373207ec3f01d3d0545d5"} Mar 18 09:16:54 crc kubenswrapper[4778]: I0318 09:16:54.769258 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" podStartSLOduration=2.251525774 podStartE2EDuration="13.769239495s" podCreationTimestamp="2026-03-18 09:16:41 +0000 UTC" firstStartedPulling="2026-03-18 09:16:42.551857892 +0000 UTC m=+869.126602732" lastFinishedPulling="2026-03-18 09:16:54.069571613 +0000 UTC m=+880.644316453" observedRunningTime="2026-03-18 09:16:54.766678105 +0000 UTC m=+881.341422975" watchObservedRunningTime="2026-03-18 09:16:54.769239495 +0000 UTC m=+881.343984345" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.721470 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.722248 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.725310 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-pnzh2" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.737901 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-thw7f"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.739523 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.742359 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.754559 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.765655 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-thw7f"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.792811 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-5thsf"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.794499 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.834125 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc9zm\" (UniqueName: \"kubernetes.io/projected/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-kube-api-access-kc9zm\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.834167 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.834190 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-klz7l\" (UniqueName: \"kubernetes.io/projected/71b50b27-6084-4693-acbc-d14f36759618-kube-api-access-klz7l\") pod \"nmstate-metrics-9b8c8685d-wq8gr\" (UID: \"71b50b27-6084-4693-acbc-d14f36759618\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.886761 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.887672 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.890281 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6r2fb" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.890440 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.890656 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.900414 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935460 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf9dr\" (UniqueName: \"kubernetes.io/projected/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-kube-api-access-kf9dr\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935600 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-ovs-socket\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935681 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-nmstate-lock\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935753 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc9zm\" (UniqueName: \"kubernetes.io/projected/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-kube-api-access-kc9zm\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935784 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-dbus-socket\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935814 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klz7l\" (UniqueName: 
\"kubernetes.io/projected/71b50b27-6084-4693-acbc-d14f36759618-kube-api-access-klz7l\") pod \"nmstate-metrics-9b8c8685d-wq8gr\" (UID: \"71b50b27-6084-4693-acbc-d14f36759618\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" Mar 18 09:16:55 crc kubenswrapper[4778]: E0318 09:16:55.936057 4778 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 18 09:16:55 crc kubenswrapper[4778]: E0318 09:16:55.936106 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-tls-key-pair podName:5961b98d-a41a-4ceb-bb71-4bf3a0fc854d nodeName:}" failed. No retries permitted until 2026-03-18 09:16:56.436090212 +0000 UTC m=+883.010835052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-tls-key-pair") pod "nmstate-webhook-5f558f5558-thw7f" (UID: "5961b98d-a41a-4ceb-bb71-4bf3a0fc854d") : secret "openshift-nmstate-webhook" not found Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.959496 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klz7l\" (UniqueName: \"kubernetes.io/projected/71b50b27-6084-4693-acbc-d14f36759618-kube-api-access-klz7l\") pod \"nmstate-metrics-9b8c8685d-wq8gr\" (UID: \"71b50b27-6084-4693-acbc-d14f36759618\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.961849 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc9zm\" (UniqueName: \"kubernetes.io/projected/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-kube-api-access-kc9zm\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.037734 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8b636ef7-4b85-4506-bb2a-f89bee9b028d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.037835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-ovs-socket\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.037932 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b636ef7-4b85-4506-bb2a-f89bee9b028d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.037986 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-nmstate-lock\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.038053 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-nmstate-lock\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.037982 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-ovs-socket\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.038240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-dbus-socket\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.038374 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf9dr\" (UniqueName: \"kubernetes.io/projected/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-kube-api-access-kf9dr\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.038467 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klxq2\" (UniqueName: \"kubernetes.io/projected/8b636ef7-4b85-4506-bb2a-f89bee9b028d-kube-api-access-klxq2\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.038751 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.038922 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-dbus-socket\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.065140 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf9dr\" (UniqueName: \"kubernetes.io/projected/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-kube-api-access-kf9dr\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.098832 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-65b6b6f7c5-nrwfx"] Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.109817 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.119132 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.120737 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65b6b6f7c5-nrwfx"] Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.139835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klxq2\" (UniqueName: \"kubernetes.io/projected/8b636ef7-4b85-4506-bb2a-f89bee9b028d-kube-api-access-klxq2\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.140128 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8b636ef7-4b85-4506-bb2a-f89bee9b028d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.140284 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b636ef7-4b85-4506-bb2a-f89bee9b028d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: E0318 09:16:56.140512 4778 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 18 09:16:56 crc kubenswrapper[4778]: E0318 09:16:56.140661 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b636ef7-4b85-4506-bb2a-f89bee9b028d-plugin-serving-cert podName:8b636ef7-4b85-4506-bb2a-f89bee9b028d nodeName:}" failed. 
No retries permitted until 2026-03-18 09:16:56.640640583 +0000 UTC m=+883.215385433 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/8b636ef7-4b85-4506-bb2a-f89bee9b028d-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-22c9p" (UID: "8b636ef7-4b85-4506-bb2a-f89bee9b028d") : secret "plugin-serving-cert" not found Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.141095 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8b636ef7-4b85-4506-bb2a-f89bee9b028d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.156820 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klxq2\" (UniqueName: \"kubernetes.io/projected/8b636ef7-4b85-4506-bb2a-f89bee9b028d-kube-api-access-klxq2\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241128 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-oauth-serving-cert\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241174 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-config\") pod \"console-65b6b6f7c5-nrwfx\" (UID: 
\"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241229 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-serving-cert\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241250 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsnvx\" (UniqueName: \"kubernetes.io/projected/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-kube-api-access-wsnvx\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241297 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-oauth-config\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241329 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-trusted-ca-bundle\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241375 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-service-ca\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.260618 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr"] Mar 18 09:16:56 crc kubenswrapper[4778]: W0318 09:16:56.267791 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71b50b27_6084_4693_acbc_d14f36759618.slice/crio-ebf7f4cde51e95f80ba790fac53e82acf099db419c398861f8469f6cc399183f WatchSource:0}: Error finding container ebf7f4cde51e95f80ba790fac53e82acf099db419c398861f8469f6cc399183f: Status 404 returned error can't find the container with id ebf7f4cde51e95f80ba790fac53e82acf099db419c398861f8469f6cc399183f Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.343428 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-oauth-serving-cert\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.343845 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-config\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.343906 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-serving-cert\") pod 
\"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.343949 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsnvx\" (UniqueName: \"kubernetes.io/projected/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-kube-api-access-wsnvx\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.344103 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-oauth-config\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.344180 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-trusted-ca-bundle\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.344351 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-service-ca\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.344371 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-oauth-serving-cert\") pod \"console-65b6b6f7c5-nrwfx\" 
(UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.346246 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-config\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.346718 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-service-ca\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.347933 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-trusted-ca-bundle\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.350122 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-serving-cert\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.350692 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-oauth-config\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " 
pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.361947 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsnvx\" (UniqueName: \"kubernetes.io/projected/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-kube-api-access-wsnvx\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.438773 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.446073 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.450759 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.649305 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b636ef7-4b85-4506-bb2a-f89bee9b028d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.654515 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.655750 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b636ef7-4b85-4506-bb2a-f89bee9b028d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.720943 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65b6b6f7c5-nrwfx"] Mar 18 09:16:56 crc kubenswrapper[4778]: W0318 09:16:56.734590 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bb4ece6_9324_4551_9be5_b0d2f6b6d597.slice/crio-f6727198f91acacfa1507ae495cc6a99e7e02db7d492d388fe5ee3314cd21064 WatchSource:0}: Error finding container f6727198f91acacfa1507ae495cc6a99e7e02db7d492d388fe5ee3314cd21064: Status 404 returned error can't find the container with id f6727198f91acacfa1507ae495cc6a99e7e02db7d492d388fe5ee3314cd21064 Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.762857 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" event={"ID":"71b50b27-6084-4693-acbc-d14f36759618","Type":"ContainerStarted","Data":"ebf7f4cde51e95f80ba790fac53e82acf099db419c398861f8469f6cc399183f"} Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.763621 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65b6b6f7c5-nrwfx" event={"ID":"6bb4ece6-9324-4551-9be5-b0d2f6b6d597","Type":"ContainerStarted","Data":"f6727198f91acacfa1507ae495cc6a99e7e02db7d492d388fe5ee3314cd21064"} Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.766146 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-5thsf" event={"ID":"5b97fa25-4d3d-4664-a5fc-41c98bbd272f","Type":"ContainerStarted","Data":"477e73f9bcf200c7dcf2019d15261cf8e232ed521ecf42122cc54af0613cf434"} Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.801630 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:57 crc kubenswrapper[4778]: I0318 09:16:57.019380 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p"] Mar 18 09:16:57 crc kubenswrapper[4778]: W0318 09:16:57.027112 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b636ef7_4b85_4506_bb2a_f89bee9b028d.slice/crio-46b4fb23f770c5e0c182ef28a909880470ccecab4e3a37c9ba0ef3f9406217be WatchSource:0}: Error finding container 46b4fb23f770c5e0c182ef28a909880470ccecab4e3a37c9ba0ef3f9406217be: Status 404 returned error can't find the container with id 46b4fb23f770c5e0c182ef28a909880470ccecab4e3a37c9ba0ef3f9406217be Mar 18 09:16:57 crc kubenswrapper[4778]: I0318 09:16:57.087730 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-thw7f"] Mar 18 09:16:57 crc kubenswrapper[4778]: W0318 09:16:57.090336 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5961b98d_a41a_4ceb_bb71_4bf3a0fc854d.slice/crio-a90915ed5b06b495028e5cd1931c6c45ba6b78135e8fa803a5fd2b12d02d6d62 WatchSource:0}: Error finding container a90915ed5b06b495028e5cd1931c6c45ba6b78135e8fa803a5fd2b12d02d6d62: Status 404 returned error can't find the container with id a90915ed5b06b495028e5cd1931c6c45ba6b78135e8fa803a5fd2b12d02d6d62 Mar 18 09:16:57 crc kubenswrapper[4778]: I0318 09:16:57.775455 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" event={"ID":"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d","Type":"ContainerStarted","Data":"a90915ed5b06b495028e5cd1931c6c45ba6b78135e8fa803a5fd2b12d02d6d62"} Mar 18 09:16:57 crc kubenswrapper[4778]: I0318 09:16:57.777756 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65b6b6f7c5-nrwfx" event={"ID":"6bb4ece6-9324-4551-9be5-b0d2f6b6d597","Type":"ContainerStarted","Data":"1c806c27ee9ca32e9f9d0916a8675121329938c0c4c9193173623a419df71be7"} Mar 18 09:16:57 crc kubenswrapper[4778]: I0318 09:16:57.780005 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" event={"ID":"8b636ef7-4b85-4506-bb2a-f89bee9b028d","Type":"ContainerStarted","Data":"46b4fb23f770c5e0c182ef28a909880470ccecab4e3a37c9ba0ef3f9406217be"} Mar 18 09:16:57 crc kubenswrapper[4778]: I0318 09:16:57.812958 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65b6b6f7c5-nrwfx" podStartSLOduration=1.8128924130000001 podStartE2EDuration="1.812892413s" podCreationTimestamp="2026-03-18 09:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:16:57.808495762 +0000 UTC m=+884.383240672" watchObservedRunningTime="2026-03-18 09:16:57.812892413 +0000 UTC m=+884.387637283" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.148101 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.148447 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.148487 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.148973 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bce801303fcc67febe135eadcc45500477ce5c50a4d9315c95b48aad6bc9b1d"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.149019 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://2bce801303fcc67febe135eadcc45500477ce5c50a4d9315c95b48aad6bc9b1d" gracePeriod=600 Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.803856 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5thsf" event={"ID":"5b97fa25-4d3d-4664-a5fc-41c98bbd272f","Type":"ContainerStarted","Data":"b3cf535dc0116b62d95f2699f650d59453ac02ed65b84a1ce82a040cda162d34"} Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.804260 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.806018 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" 
event={"ID":"8b636ef7-4b85-4506-bb2a-f89bee9b028d","Type":"ContainerStarted","Data":"0da7043a51a86c0b9a2a6c0f67cd3b54d94b5026b00e290015d06efa67915cb0"} Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.811119 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="2bce801303fcc67febe135eadcc45500477ce5c50a4d9315c95b48aad6bc9b1d" exitCode=0 Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.811185 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"2bce801303fcc67febe135eadcc45500477ce5c50a4d9315c95b48aad6bc9b1d"} Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.811233 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"ea61f21841c4e8176f6c189293fbb254c950982318bec3d46909c1b0e41c2f38"} Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.811258 4778 scope.go:117] "RemoveContainer" containerID="8352df71b83dce7216f889f9357e2f46d95d02444a6764ed6ea4a1748338a832" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.813024 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" event={"ID":"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d","Type":"ContainerStarted","Data":"1b8f72f0fe13159246a3839933c60ea581e784f55a5364f4519f5e862ba599aa"} Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.813241 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.851034 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-5thsf" podStartSLOduration=2.039830937 
podStartE2EDuration="5.851007877s" podCreationTimestamp="2026-03-18 09:16:55 +0000 UTC" firstStartedPulling="2026-03-18 09:16:56.143141922 +0000 UTC m=+882.717886762" lastFinishedPulling="2026-03-18 09:16:59.954318842 +0000 UTC m=+886.529063702" observedRunningTime="2026-03-18 09:17:00.825414596 +0000 UTC m=+887.400159446" watchObservedRunningTime="2026-03-18 09:17:00.851007877 +0000 UTC m=+887.425752727" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.868402 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" podStartSLOduration=2.9408840229999997 podStartE2EDuration="5.868379094s" podCreationTimestamp="2026-03-18 09:16:55 +0000 UTC" firstStartedPulling="2026-03-18 09:16:57.029647168 +0000 UTC m=+883.604392008" lastFinishedPulling="2026-03-18 09:16:59.957142229 +0000 UTC m=+886.531887079" observedRunningTime="2026-03-18 09:17:00.86496016 +0000 UTC m=+887.439705050" watchObservedRunningTime="2026-03-18 09:17:00.868379094 +0000 UTC m=+887.443123944" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.888350 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" podStartSLOduration=3.024883796 podStartE2EDuration="5.88830957s" podCreationTimestamp="2026-03-18 09:16:55 +0000 UTC" firstStartedPulling="2026-03-18 09:16:57.09388736 +0000 UTC m=+883.668632210" lastFinishedPulling="2026-03-18 09:16:59.957313094 +0000 UTC m=+886.532057984" observedRunningTime="2026-03-18 09:17:00.882804229 +0000 UTC m=+887.457549079" watchObservedRunningTime="2026-03-18 09:17:00.88830957 +0000 UTC m=+887.463054420" Mar 18 09:17:01 crc kubenswrapper[4778]: I0318 09:17:01.532354 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:17:01 crc kubenswrapper[4778]: I0318 09:17:01.823579 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" event={"ID":"71b50b27-6084-4693-acbc-d14f36759618","Type":"ContainerStarted","Data":"e98d8d4ce2b85174d6090313f532e7c6f2a36c24e05b766a90f3aa888ba6301f"} Mar 18 09:17:04 crc kubenswrapper[4778]: I0318 09:17:04.889613 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" event={"ID":"71b50b27-6084-4693-acbc-d14f36759618","Type":"ContainerStarted","Data":"8bf22161969d45f052561ed6e4783da26e5240e98bbcbfef9362220487edf528"} Mar 18 09:17:04 crc kubenswrapper[4778]: I0318 09:17:04.909677 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" podStartSLOduration=2.061117241 podStartE2EDuration="9.909657236s" podCreationTimestamp="2026-03-18 09:16:55 +0000 UTC" firstStartedPulling="2026-03-18 09:16:56.273531008 +0000 UTC m=+882.848275848" lastFinishedPulling="2026-03-18 09:17:04.122071003 +0000 UTC m=+890.696815843" observedRunningTime="2026-03-18 09:17:04.907065225 +0000 UTC m=+891.481810155" watchObservedRunningTime="2026-03-18 09:17:04.909657236 +0000 UTC m=+891.484402086" Mar 18 09:17:06 crc kubenswrapper[4778]: I0318 09:17:06.152763 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:17:06 crc kubenswrapper[4778]: I0318 09:17:06.439931 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:17:06 crc kubenswrapper[4778]: I0318 09:17:06.440227 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:17:06 crc kubenswrapper[4778]: I0318 09:17:06.448873 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:17:06 crc kubenswrapper[4778]: I0318 09:17:06.913545 4778 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:17:06 crc kubenswrapper[4778]: I0318 09:17:06.982608 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pgsqh"] Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.796377 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cbbb4"] Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.798016 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.812416 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbbb4"] Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.834976 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-utilities\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.835027 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmrmm\" (UniqueName: \"kubernetes.io/projected/e437a76c-331a-422b-aafb-036febcf9e98-kube-api-access-bmrmm\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.835399 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-catalog-content\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " 
pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.937459 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-catalog-content\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.938253 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-catalog-content\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.939938 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-utilities\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.940542 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-utilities\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.941006 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmrmm\" (UniqueName: \"kubernetes.io/projected/e437a76c-331a-422b-aafb-036febcf9e98-kube-api-access-bmrmm\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" 
Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.964759 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmrmm\" (UniqueName: \"kubernetes.io/projected/e437a76c-331a-422b-aafb-036febcf9e98-kube-api-access-bmrmm\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:08 crc kubenswrapper[4778]: I0318 09:17:08.116063 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:08 crc kubenswrapper[4778]: I0318 09:17:08.360130 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbbb4"] Mar 18 09:17:08 crc kubenswrapper[4778]: W0318 09:17:08.369237 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode437a76c_331a_422b_aafb_036febcf9e98.slice/crio-9dc755ad4dc74d5e7bbf029e4b451ae2af505c3ba4a46b3dc50587de84cf2990 WatchSource:0}: Error finding container 9dc755ad4dc74d5e7bbf029e4b451ae2af505c3ba4a46b3dc50587de84cf2990: Status 404 returned error can't find the container with id 9dc755ad4dc74d5e7bbf029e4b451ae2af505c3ba4a46b3dc50587de84cf2990 Mar 18 09:17:08 crc kubenswrapper[4778]: I0318 09:17:08.922724 4778 generic.go:334] "Generic (PLEG): container finished" podID="e437a76c-331a-422b-aafb-036febcf9e98" containerID="eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd" exitCode=0 Mar 18 09:17:08 crc kubenswrapper[4778]: I0318 09:17:08.922767 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbbb4" event={"ID":"e437a76c-331a-422b-aafb-036febcf9e98","Type":"ContainerDied","Data":"eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd"} Mar 18 09:17:08 crc kubenswrapper[4778]: I0318 09:17:08.922791 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-cbbb4" event={"ID":"e437a76c-331a-422b-aafb-036febcf9e98","Type":"ContainerStarted","Data":"9dc755ad4dc74d5e7bbf029e4b451ae2af505c3ba4a46b3dc50587de84cf2990"} Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.188401 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sxg2m"] Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.190589 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.219401 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxg2m"] Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.301608 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-catalog-content\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.301728 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-utilities\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.301791 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pn5q\" (UniqueName: \"kubernetes.io/projected/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-kube-api-access-6pn5q\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 
09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.403177 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-catalog-content\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.403255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-utilities\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.403289 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pn5q\" (UniqueName: \"kubernetes.io/projected/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-kube-api-access-6pn5q\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.404329 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-catalog-content\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.404376 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-utilities\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 
09:17:11.427096 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pn5q\" (UniqueName: \"kubernetes.io/projected/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-kube-api-access-6pn5q\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.535301 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.775932 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxg2m"] Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.946922 4778 generic.go:334] "Generic (PLEG): container finished" podID="e437a76c-331a-422b-aafb-036febcf9e98" containerID="3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09" exitCode=0 Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.947035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbbb4" event={"ID":"e437a76c-331a-422b-aafb-036febcf9e98","Type":"ContainerDied","Data":"3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09"} Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.948249 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerStarted","Data":"ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c"} Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.948290 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerStarted","Data":"1e5d40c4001d2f6677961b892e3d7f1ae3ee8cd89360a0d7ba77365018ec7e6f"} Mar 18 09:17:12 crc kubenswrapper[4778]: I0318 
09:17:12.957680 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbbb4" event={"ID":"e437a76c-331a-422b-aafb-036febcf9e98","Type":"ContainerStarted","Data":"a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588"} Mar 18 09:17:12 crc kubenswrapper[4778]: I0318 09:17:12.960088 4778 generic.go:334] "Generic (PLEG): container finished" podID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerID="ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c" exitCode=0 Mar 18 09:17:12 crc kubenswrapper[4778]: I0318 09:17:12.960116 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerDied","Data":"ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c"} Mar 18 09:17:12 crc kubenswrapper[4778]: I0318 09:17:12.983627 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cbbb4" podStartSLOduration=2.547354857 podStartE2EDuration="5.983606663s" podCreationTimestamp="2026-03-18 09:17:07 +0000 UTC" firstStartedPulling="2026-03-18 09:17:08.924556044 +0000 UTC m=+895.499300874" lastFinishedPulling="2026-03-18 09:17:12.36080784 +0000 UTC m=+898.935552680" observedRunningTime="2026-03-18 09:17:12.979936953 +0000 UTC m=+899.554681813" watchObservedRunningTime="2026-03-18 09:17:12.983606663 +0000 UTC m=+899.558351513" Mar 18 09:17:13 crc kubenswrapper[4778]: I0318 09:17:13.969164 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerStarted","Data":"a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec"} Mar 18 09:17:14 crc kubenswrapper[4778]: I0318 09:17:14.981825 4778 generic.go:334] "Generic (PLEG): container finished" podID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" 
containerID="a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec" exitCode=0 Mar 18 09:17:14 crc kubenswrapper[4778]: I0318 09:17:14.981910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerDied","Data":"a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec"} Mar 18 09:17:16 crc kubenswrapper[4778]: I0318 09:17:15.999125 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerStarted","Data":"9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187"} Mar 18 09:17:16 crc kubenswrapper[4778]: I0318 09:17:16.033179 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sxg2m" podStartSLOduration=2.595827277 podStartE2EDuration="5.033151412s" podCreationTimestamp="2026-03-18 09:17:11 +0000 UTC" firstStartedPulling="2026-03-18 09:17:12.96162923 +0000 UTC m=+899.536374070" lastFinishedPulling="2026-03-18 09:17:15.398953325 +0000 UTC m=+901.973698205" observedRunningTime="2026-03-18 09:17:16.025512982 +0000 UTC m=+902.600257912" watchObservedRunningTime="2026-03-18 09:17:16.033151412 +0000 UTC m=+902.607896292" Mar 18 09:17:16 crc kubenswrapper[4778]: I0318 09:17:16.663056 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:17:18 crc kubenswrapper[4778]: I0318 09:17:18.116566 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:18 crc kubenswrapper[4778]: I0318 09:17:18.116665 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:18 crc kubenswrapper[4778]: I0318 09:17:18.186020 
4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:19 crc kubenswrapper[4778]: I0318 09:17:19.086514 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:19 crc kubenswrapper[4778]: I0318 09:17:19.372535 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbbb4"] Mar 18 09:17:19 crc kubenswrapper[4778]: I0318 09:17:19.788950 4778 scope.go:117] "RemoveContainer" containerID="c5bd546fb47bde264ad4459aced4ba49381ccd5bb127c64ac227483b8bb621c0" Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.032180 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cbbb4" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="registry-server" containerID="cri-o://a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588" gracePeriod=2 Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.505152 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.535540 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.535647 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.558648 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-catalog-content\") pod \"e437a76c-331a-422b-aafb-036febcf9e98\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.559471 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmrmm\" (UniqueName: \"kubernetes.io/projected/e437a76c-331a-422b-aafb-036febcf9e98-kube-api-access-bmrmm\") pod \"e437a76c-331a-422b-aafb-036febcf9e98\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.559535 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-utilities\") pod \"e437a76c-331a-422b-aafb-036febcf9e98\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.561730 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-utilities" (OuterVolumeSpecName: "utilities") pod "e437a76c-331a-422b-aafb-036febcf9e98" (UID: "e437a76c-331a-422b-aafb-036febcf9e98"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.570225 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e437a76c-331a-422b-aafb-036febcf9e98-kube-api-access-bmrmm" (OuterVolumeSpecName: "kube-api-access-bmrmm") pod "e437a76c-331a-422b-aafb-036febcf9e98" (UID: "e437a76c-331a-422b-aafb-036febcf9e98"). InnerVolumeSpecName "kube-api-access-bmrmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.607368 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.610279 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e437a76c-331a-422b-aafb-036febcf9e98" (UID: "e437a76c-331a-422b-aafb-036febcf9e98"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.662519 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.662567 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmrmm\" (UniqueName: \"kubernetes.io/projected/e437a76c-331a-422b-aafb-036febcf9e98-kube-api-access-bmrmm\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.662586 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.044008 4778 generic.go:334] "Generic (PLEG): container finished" podID="e437a76c-331a-422b-aafb-036febcf9e98" containerID="a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588" exitCode=0 Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.044127 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbbb4" event={"ID":"e437a76c-331a-422b-aafb-036febcf9e98","Type":"ContainerDied","Data":"a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588"} Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.044112 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.044219 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbbb4" event={"ID":"e437a76c-331a-422b-aafb-036febcf9e98","Type":"ContainerDied","Data":"9dc755ad4dc74d5e7bbf029e4b451ae2af505c3ba4a46b3dc50587de84cf2990"} Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.044255 4778 scope.go:117] "RemoveContainer" containerID="a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588" Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.072001 4778 scope.go:117] "RemoveContainer" containerID="3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09" Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.085405 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbbb4"] Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.089744 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbbb4"] Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.122271 4778 scope.go:117] "RemoveContainer" containerID="eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd" Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.153898 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.166037 4778 scope.go:117] "RemoveContainer" containerID="a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588" Mar 18 09:17:22 crc kubenswrapper[4778]: E0318 09:17:22.168619 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588\": container with ID starting with 
a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588 not found: ID does not exist" containerID="a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588" Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.168667 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588"} err="failed to get container status \"a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588\": rpc error: code = NotFound desc = could not find container \"a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588\": container with ID starting with a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588 not found: ID does not exist" Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.168713 4778 scope.go:117] "RemoveContainer" containerID="3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09" Mar 18 09:17:22 crc kubenswrapper[4778]: E0318 09:17:22.169082 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09\": container with ID starting with 3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09 not found: ID does not exist" containerID="3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09" Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.169142 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09"} err="failed to get container status \"3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09\": rpc error: code = NotFound desc = could not find container \"3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09\": container with ID starting with 3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09 not found: ID does not 
exist" Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.169174 4778 scope.go:117] "RemoveContainer" containerID="eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd" Mar 18 09:17:22 crc kubenswrapper[4778]: E0318 09:17:22.169600 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd\": container with ID starting with eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd not found: ID does not exist" containerID="eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd" Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.169633 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd"} err="failed to get container status \"eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd\": rpc error: code = NotFound desc = could not find container \"eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd\": container with ID starting with eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd not found: ID does not exist" Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.194400 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e437a76c-331a-422b-aafb-036febcf9e98" path="/var/lib/kubelet/pods/e437a76c-331a-422b-aafb-036febcf9e98/volumes" Mar 18 09:17:23 crc kubenswrapper[4778]: I0318 09:17:23.371984 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxg2m"] Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.058697 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sxg2m" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="registry-server" 
containerID="cri-o://9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187" gracePeriod=2 Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.573854 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.606532 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-catalog-content\") pod \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.606581 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-utilities\") pod \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.606697 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pn5q\" (UniqueName: \"kubernetes.io/projected/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-kube-api-access-6pn5q\") pod \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.608087 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-utilities" (OuterVolumeSpecName: "utilities") pod "dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" (UID: "dc0f993f-9cd4-479c-b61f-e2eed4a69c2d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.615625 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-kube-api-access-6pn5q" (OuterVolumeSpecName: "kube-api-access-6pn5q") pod "dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" (UID: "dc0f993f-9cd4-479c-b61f-e2eed4a69c2d"). InnerVolumeSpecName "kube-api-access-6pn5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.707539 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.707574 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pn5q\" (UniqueName: \"kubernetes.io/projected/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-kube-api-access-6pn5q\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.068364 4778 generic.go:334] "Generic (PLEG): container finished" podID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerID="9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187" exitCode=0 Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.068444 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerDied","Data":"9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187"} Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.068520 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerDied","Data":"1e5d40c4001d2f6677961b892e3d7f1ae3ee8cd89360a0d7ba77365018ec7e6f"} Mar 18 09:17:25 crc kubenswrapper[4778]: 
I0318 09:17:25.068587 4778 scope.go:117] "RemoveContainer" containerID="9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.068613 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.099720 4778 scope.go:117] "RemoveContainer" containerID="a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.122899 4778 scope.go:117] "RemoveContainer" containerID="ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.165118 4778 scope.go:117] "RemoveContainer" containerID="9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187" Mar 18 09:17:25 crc kubenswrapper[4778]: E0318 09:17:25.165788 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187\": container with ID starting with 9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187 not found: ID does not exist" containerID="9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.165842 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187"} err="failed to get container status \"9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187\": rpc error: code = NotFound desc = could not find container \"9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187\": container with ID starting with 9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187 not found: ID does not exist" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.165879 4778 
scope.go:117] "RemoveContainer" containerID="a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec" Mar 18 09:17:25 crc kubenswrapper[4778]: E0318 09:17:25.166620 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec\": container with ID starting with a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec not found: ID does not exist" containerID="a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.166680 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec"} err="failed to get container status \"a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec\": rpc error: code = NotFound desc = could not find container \"a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec\": container with ID starting with a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec not found: ID does not exist" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.166732 4778 scope.go:117] "RemoveContainer" containerID="ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c" Mar 18 09:17:25 crc kubenswrapper[4778]: E0318 09:17:25.167137 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c\": container with ID starting with ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c not found: ID does not exist" containerID="ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.167188 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c"} err="failed to get container status \"ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c\": rpc error: code = NotFound desc = could not find container \"ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c\": container with ID starting with ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c not found: ID does not exist" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.183255 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" (UID: "dc0f993f-9cd4-479c-b61f-e2eed4a69c2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.215711 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.430959 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxg2m"] Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.438418 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sxg2m"] Mar 18 09:17:26 crc kubenswrapper[4778]: I0318 09:17:26.199880 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" path="/var/lib/kubelet/pods/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d/volumes" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.052836 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-pgsqh" 
podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerName="console" containerID="cri-o://f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0" gracePeriod=15 Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.479380 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pgsqh_5f875d21-ddf2-4d41-8be3-819c8836824a/console/0.log" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.479705 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557056 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-trusted-ca-bundle\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557135 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-console-config\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557189 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7qw9\" (UniqueName: \"kubernetes.io/projected/5f875d21-ddf2-4d41-8be3-819c8836824a-kube-api-access-z7qw9\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557228 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-oauth-serving-cert\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: 
\"5f875d21-ddf2-4d41-8be3-819c8836824a\") " Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557248 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-service-ca\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557272 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-serving-cert\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557324 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-oauth-config\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.558145 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.558438 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-console-config" (OuterVolumeSpecName: "console-config") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.558496 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-service-ca" (OuterVolumeSpecName: "service-ca") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.559168 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.566700 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.567421 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f875d21-ddf2-4d41-8be3-819c8836824a-kube-api-access-z7qw9" (OuterVolumeSpecName: "kube-api-access-z7qw9") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "kube-api-access-z7qw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.567706 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658447 4778 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658484 4778 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658497 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7qw9\" (UniqueName: \"kubernetes.io/projected/5f875d21-ddf2-4d41-8be3-819c8836824a-kube-api-access-z7qw9\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658511 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658523 4778 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658534 4778 reconciler_common.go:293] "Volume detached for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658546 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.131804 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pgsqh_5f875d21-ddf2-4d41-8be3-819c8836824a/console/0.log" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.131874 4778 generic.go:334] "Generic (PLEG): container finished" podID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerID="f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0" exitCode=2 Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.131910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pgsqh" event={"ID":"5f875d21-ddf2-4d41-8be3-819c8836824a","Type":"ContainerDied","Data":"f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0"} Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.131954 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pgsqh" event={"ID":"5f875d21-ddf2-4d41-8be3-819c8836824a","Type":"ContainerDied","Data":"0526269f3752c495953fd88d5da903a92103220f8039ec4c7dde34390b5f6401"} Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.131961 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.131989 4778 scope.go:117] "RemoveContainer" containerID="f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.170632 4778 scope.go:117] "RemoveContainer" containerID="f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.170834 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pgsqh"] Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.171630 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0\": container with ID starting with f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0 not found: ID does not exist" containerID="f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.171711 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0"} err="failed to get container status \"f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0\": rpc error: code = NotFound desc = could not find container \"f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0\": container with ID starting with f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0 not found: ID does not exist" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.177822 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-pgsqh"] Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376090 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl"] Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376638 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="extract-utilities" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376651 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="extract-utilities" Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376663 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="extract-content" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376669 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="extract-content" Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376678 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="extract-content" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376684 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="extract-content" Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376696 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="registry-server" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376702 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="registry-server" Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376713 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="extract-utilities" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376719 4778 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="extract-utilities" Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376728 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerName="console" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376736 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerName="console" Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376745 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="registry-server" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376751 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="registry-server" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376838 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="registry-server" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376848 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="registry-server" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376859 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerName="console" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.377605 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.380336 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.396732 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl"] Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.473564 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.473647 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.473677 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh6hm\" (UniqueName: \"kubernetes.io/projected/2416fdd2-138d-4320-8ff6-47f621e093a9-kube-api-access-hh6hm\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: 
I0318 09:17:33.574707 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.574757 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh6hm\" (UniqueName: \"kubernetes.io/projected/2416fdd2-138d-4320-8ff6-47f621e093a9-kube-api-access-hh6hm\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.574838 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.575374 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.575499 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.600422 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh6hm\" (UniqueName: \"kubernetes.io/projected/2416fdd2-138d-4320-8ff6-47f621e093a9-kube-api-access-hh6hm\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.697314 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.935506 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl"] Mar 18 09:17:34 crc kubenswrapper[4778]: I0318 09:17:34.141445 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" event={"ID":"2416fdd2-138d-4320-8ff6-47f621e093a9","Type":"ContainerStarted","Data":"ab0f6ddf348551a9eea8ce5154623b62b1e5f456495ce97d032488c506954ea0"} Mar 18 09:17:34 crc kubenswrapper[4778]: I0318 09:17:34.141494 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" event={"ID":"2416fdd2-138d-4320-8ff6-47f621e093a9","Type":"ContainerStarted","Data":"4af378aa95ab5264d0f168d006956f0deb2cbc8707b67b859997751f0ca58b65"} Mar 18 09:17:34 crc kubenswrapper[4778]: I0318 09:17:34.196090 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" path="/var/lib/kubelet/pods/5f875d21-ddf2-4d41-8be3-819c8836824a/volumes" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.151539 4778 generic.go:334] "Generic (PLEG): container finished" podID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerID="ab0f6ddf348551a9eea8ce5154623b62b1e5f456495ce97d032488c506954ea0" exitCode=0 Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.151614 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" event={"ID":"2416fdd2-138d-4320-8ff6-47f621e093a9","Type":"ContainerDied","Data":"ab0f6ddf348551a9eea8ce5154623b62b1e5f456495ce97d032488c506954ea0"} Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.490339 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hdvpq"] Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.491582 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.506289 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hdvpq"] Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.612758 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-utilities\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.612843 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-catalog-content\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.612887 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcmnt\" (UniqueName: \"kubernetes.io/projected/8418cdea-ff67-4e52-acf8-39176b7f0cb6-kube-api-access-dcmnt\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.714335 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-utilities\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.714403 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-catalog-content\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.714434 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcmnt\" (UniqueName: \"kubernetes.io/projected/8418cdea-ff67-4e52-acf8-39176b7f0cb6-kube-api-access-dcmnt\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.714970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-utilities\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.715011 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-catalog-content\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.747229 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcmnt\" (UniqueName: \"kubernetes.io/projected/8418cdea-ff67-4e52-acf8-39176b7f0cb6-kube-api-access-dcmnt\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.859638 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:36 crc kubenswrapper[4778]: I0318 09:17:36.340294 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hdvpq"] Mar 18 09:17:37 crc kubenswrapper[4778]: I0318 09:17:37.167290 4778 generic.go:334] "Generic (PLEG): container finished" podID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerID="20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543" exitCode=0 Mar 18 09:17:37 crc kubenswrapper[4778]: I0318 09:17:37.167375 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerDied","Data":"20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543"} Mar 18 09:17:37 crc kubenswrapper[4778]: I0318 09:17:37.167646 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerStarted","Data":"a12438818ab44535475b9d63ec863fde1df42e8be31733d20b680eaccab2c7df"} Mar 18 09:17:37 crc kubenswrapper[4778]: I0318 09:17:37.173879 4778 generic.go:334] "Generic (PLEG): container finished" podID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerID="b5f3877e0e967c7533d9431944ceb301fe3aefbefe8a0404e2d89e1b25820483" exitCode=0 Mar 18 09:17:37 crc kubenswrapper[4778]: I0318 09:17:37.173910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" event={"ID":"2416fdd2-138d-4320-8ff6-47f621e093a9","Type":"ContainerDied","Data":"b5f3877e0e967c7533d9431944ceb301fe3aefbefe8a0404e2d89e1b25820483"} Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.187580 4778 generic.go:334] "Generic (PLEG): container finished" podID="2416fdd2-138d-4320-8ff6-47f621e093a9" 
containerID="a8efadf3c7201d7501041a28e32c15d23bcfe705d0dc7e9826ae666b7fcc554b" exitCode=0 Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.201617 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" event={"ID":"2416fdd2-138d-4320-8ff6-47f621e093a9","Type":"ContainerDied","Data":"a8efadf3c7201d7501041a28e32c15d23bcfe705d0dc7e9826ae666b7fcc554b"} Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.704508 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m9755"] Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.706903 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.717426 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9755"] Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.862778 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlzdz\" (UniqueName: \"kubernetes.io/projected/5c0788a7-7426-4545-8426-1170b75287d7-kube-api-access-wlzdz\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.862855 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-utilities\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.862934 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-catalog-content\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.964435 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlzdz\" (UniqueName: \"kubernetes.io/projected/5c0788a7-7426-4545-8426-1170b75287d7-kube-api-access-wlzdz\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.964508 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-utilities\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.964544 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-catalog-content\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.965121 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-utilities\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.965163 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-catalog-content\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.989082 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlzdz\" (UniqueName: \"kubernetes.io/projected/5c0788a7-7426-4545-8426-1170b75287d7-kube-api-access-wlzdz\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.028420 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.228301 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerStarted","Data":"6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff"} Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.319090 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9755"] Mar 18 09:17:39 crc kubenswrapper[4778]: W0318 09:17:39.345685 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c0788a7_7426_4545_8426_1170b75287d7.slice/crio-b4986a5669e363b7c8dbe637152b9e988c9b8b567af7a16fb425dd9d24fbd246 WatchSource:0}: Error finding container b4986a5669e363b7c8dbe637152b9e988c9b8b567af7a16fb425dd9d24fbd246: Status 404 returned error can't find the container with id b4986a5669e363b7c8dbe637152b9e988c9b8b567af7a16fb425dd9d24fbd246 Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.490575 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.575505 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-bundle\") pod \"2416fdd2-138d-4320-8ff6-47f621e093a9\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.575566 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-util\") pod \"2416fdd2-138d-4320-8ff6-47f621e093a9\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.575624 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh6hm\" (UniqueName: \"kubernetes.io/projected/2416fdd2-138d-4320-8ff6-47f621e093a9-kube-api-access-hh6hm\") pod \"2416fdd2-138d-4320-8ff6-47f621e093a9\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.576623 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-bundle" (OuterVolumeSpecName: "bundle") pod "2416fdd2-138d-4320-8ff6-47f621e093a9" (UID: "2416fdd2-138d-4320-8ff6-47f621e093a9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.581777 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2416fdd2-138d-4320-8ff6-47f621e093a9-kube-api-access-hh6hm" (OuterVolumeSpecName: "kube-api-access-hh6hm") pod "2416fdd2-138d-4320-8ff6-47f621e093a9" (UID: "2416fdd2-138d-4320-8ff6-47f621e093a9"). InnerVolumeSpecName "kube-api-access-hh6hm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.677312 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.677373 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh6hm\" (UniqueName: \"kubernetes.io/projected/2416fdd2-138d-4320-8ff6-47f621e093a9-kube-api-access-hh6hm\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.847308 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-util" (OuterVolumeSpecName: "util") pod "2416fdd2-138d-4320-8ff6-47f621e093a9" (UID: "2416fdd2-138d-4320-8ff6-47f621e093a9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.880964 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-util\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.236866 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" event={"ID":"2416fdd2-138d-4320-8ff6-47f621e093a9","Type":"ContainerDied","Data":"4af378aa95ab5264d0f168d006956f0deb2cbc8707b67b859997751f0ca58b65"} Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.236933 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4af378aa95ab5264d0f168d006956f0deb2cbc8707b67b859997751f0ca58b65" Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.237046 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.239466 4778 generic.go:334] "Generic (PLEG): container finished" podID="5c0788a7-7426-4545-8426-1170b75287d7" containerID="92fe5901b0aca7f6a3ebd767beba9a53c2021f0beb5e9b587ad8a027aeada4c3" exitCode=0 Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.239542 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9755" event={"ID":"5c0788a7-7426-4545-8426-1170b75287d7","Type":"ContainerDied","Data":"92fe5901b0aca7f6a3ebd767beba9a53c2021f0beb5e9b587ad8a027aeada4c3"} Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.239566 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9755" event={"ID":"5c0788a7-7426-4545-8426-1170b75287d7","Type":"ContainerStarted","Data":"b4986a5669e363b7c8dbe637152b9e988c9b8b567af7a16fb425dd9d24fbd246"} Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.243847 4778 generic.go:334] "Generic (PLEG): container finished" podID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerID="6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff" exitCode=0 Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.243890 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerDied","Data":"6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff"} Mar 18 09:17:41 crc kubenswrapper[4778]: I0318 09:17:41.255385 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerStarted","Data":"8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e"} Mar 18 09:17:41 crc kubenswrapper[4778]: I0318 09:17:41.283651 4778 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hdvpq" podStartSLOduration=2.68115167 podStartE2EDuration="6.283621536s" podCreationTimestamp="2026-03-18 09:17:35 +0000 UTC" firstStartedPulling="2026-03-18 09:17:37.16851597 +0000 UTC m=+923.743260800" lastFinishedPulling="2026-03-18 09:17:40.770985786 +0000 UTC m=+927.345730666" observedRunningTime="2026-03-18 09:17:41.280391828 +0000 UTC m=+927.855136668" watchObservedRunningTime="2026-03-18 09:17:41.283621536 +0000 UTC m=+927.858366416" Mar 18 09:17:43 crc kubenswrapper[4778]: I0318 09:17:43.270803 4778 generic.go:334] "Generic (PLEG): container finished" podID="5c0788a7-7426-4545-8426-1170b75287d7" containerID="f08dc64e887f7d353c1fc7e71e4a22ef60d30f4e79ac6175a3b59c6d9cbf86f1" exitCode=0 Mar 18 09:17:43 crc kubenswrapper[4778]: I0318 09:17:43.272094 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9755" event={"ID":"5c0788a7-7426-4545-8426-1170b75287d7","Type":"ContainerDied","Data":"f08dc64e887f7d353c1fc7e71e4a22ef60d30f4e79ac6175a3b59c6d9cbf86f1"} Mar 18 09:17:44 crc kubenswrapper[4778]: I0318 09:17:44.281217 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9755" event={"ID":"5c0788a7-7426-4545-8426-1170b75287d7","Type":"ContainerStarted","Data":"6a2f0efcf9f152b1a4865f9e79cc97bc201491960eaf4a19d83789511f3642f9"} Mar 18 09:17:44 crc kubenswrapper[4778]: I0318 09:17:44.321637 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m9755" podStartSLOduration=2.905558879 podStartE2EDuration="6.321616533s" podCreationTimestamp="2026-03-18 09:17:38 +0000 UTC" firstStartedPulling="2026-03-18 09:17:40.241801753 +0000 UTC m=+926.816546593" lastFinishedPulling="2026-03-18 09:17:43.657859407 +0000 UTC m=+930.232604247" observedRunningTime="2026-03-18 09:17:44.317810629 +0000 UTC 
m=+930.892555459" watchObservedRunningTime="2026-03-18 09:17:44.321616533 +0000 UTC m=+930.896361373" Mar 18 09:17:45 crc kubenswrapper[4778]: I0318 09:17:45.860658 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:45 crc kubenswrapper[4778]: I0318 09:17:45.860718 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:46 crc kubenswrapper[4778]: I0318 09:17:46.913732 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hdvpq" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="registry-server" probeResult="failure" output=< Mar 18 09:17:46 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:17:46 crc kubenswrapper[4778]: > Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.702536 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx"] Mar 18 09:17:47 crc kubenswrapper[4778]: E0318 09:17:47.702836 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="pull" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.702861 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="pull" Mar 18 09:17:47 crc kubenswrapper[4778]: E0318 09:17:47.702897 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="util" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.702908 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="util" Mar 18 09:17:47 crc kubenswrapper[4778]: E0318 09:17:47.702933 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="extract" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.702943 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="extract" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.703072 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="extract" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.703604 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.707495 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.707585 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7w868" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.707654 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.707495 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.707867 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.716253 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx"] Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.790954 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/721ee07f-fded-43ab-9bb7-2e4e56c98515-apiservice-cert\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.791037 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/721ee07f-fded-43ab-9bb7-2e4e56c98515-webhook-cert\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.791062 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf848\" (UniqueName: \"kubernetes.io/projected/721ee07f-fded-43ab-9bb7-2e4e56c98515-kube-api-access-zf848\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.892658 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/721ee07f-fded-43ab-9bb7-2e4e56c98515-apiservice-cert\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.892754 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/721ee07f-fded-43ab-9bb7-2e4e56c98515-webhook-cert\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " 
pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.892786 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf848\" (UniqueName: \"kubernetes.io/projected/721ee07f-fded-43ab-9bb7-2e4e56c98515-kube-api-access-zf848\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.901304 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/721ee07f-fded-43ab-9bb7-2e4e56c98515-apiservice-cert\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.904957 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/721ee07f-fded-43ab-9bb7-2e4e56c98515-webhook-cert\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.909163 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf848\" (UniqueName: \"kubernetes.io/projected/721ee07f-fded-43ab-9bb7-2e4e56c98515-kube-api-access-zf848\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.975109 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr"] Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.976605 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.979581 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.979853 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7897r" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.980692 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.004232 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr"] Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.022045 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.095323 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bggwp\" (UniqueName: \"kubernetes.io/projected/75885bb8-adce-4801-8941-75042ab330ea-kube-api-access-bggwp\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.095414 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75885bb8-adce-4801-8941-75042ab330ea-webhook-cert\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.095452 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75885bb8-adce-4801-8941-75042ab330ea-apiservice-cert\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.196740 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bggwp\" (UniqueName: \"kubernetes.io/projected/75885bb8-adce-4801-8941-75042ab330ea-kube-api-access-bggwp\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.196810 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75885bb8-adce-4801-8941-75042ab330ea-webhook-cert\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.196841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75885bb8-adce-4801-8941-75042ab330ea-apiservice-cert\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.204271 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75885bb8-adce-4801-8941-75042ab330ea-apiservice-cert\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.206740 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75885bb8-adce-4801-8941-75042ab330ea-webhook-cert\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.218555 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bggwp\" (UniqueName: \"kubernetes.io/projected/75885bb8-adce-4801-8941-75042ab330ea-kube-api-access-bggwp\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: 
\"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.293264 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.353650 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx"] Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.748404 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr"] Mar 18 09:17:48 crc kubenswrapper[4778]: W0318 09:17:48.756943 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75885bb8_adce_4801_8941_75042ab330ea.slice/crio-57866cc4dc84d8985bce93348bd1a0f9aa43fcb7e969b623cb71f83e2013cab8 WatchSource:0}: Error finding container 57866cc4dc84d8985bce93348bd1a0f9aa43fcb7e969b623cb71f83e2013cab8: Status 404 returned error can't find the container with id 57866cc4dc84d8985bce93348bd1a0f9aa43fcb7e969b623cb71f83e2013cab8 Mar 18 09:17:49 crc kubenswrapper[4778]: I0318 09:17:49.028777 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:49 crc kubenswrapper[4778]: I0318 09:17:49.028823 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:49 crc kubenswrapper[4778]: I0318 09:17:49.083125 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:49 crc kubenswrapper[4778]: I0318 09:17:49.315612 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" event={"ID":"721ee07f-fded-43ab-9bb7-2e4e56c98515","Type":"ContainerStarted","Data":"dd18d307601bf07f91a1512d1b4c89016803e2fcd9a0ede6a6fbd3ecd038c2b4"} Mar 18 09:17:49 crc kubenswrapper[4778]: I0318 09:17:49.316758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" event={"ID":"75885bb8-adce-4801-8941-75042ab330ea","Type":"ContainerStarted","Data":"57866cc4dc84d8985bce93348bd1a0f9aa43fcb7e969b623cb71f83e2013cab8"} Mar 18 09:17:49 crc kubenswrapper[4778]: I0318 09:17:49.377807 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:51 crc kubenswrapper[4778]: I0318 09:17:51.678208 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m9755"] Mar 18 09:17:51 crc kubenswrapper[4778]: I0318 09:17:51.680068 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m9755" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="registry-server" containerID="cri-o://6a2f0efcf9f152b1a4865f9e79cc97bc201491960eaf4a19d83789511f3642f9" gracePeriod=2 Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.358478 4778 generic.go:334] "Generic (PLEG): container finished" podID="5c0788a7-7426-4545-8426-1170b75287d7" containerID="6a2f0efcf9f152b1a4865f9e79cc97bc201491960eaf4a19d83789511f3642f9" exitCode=0 Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.358697 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9755" event={"ID":"5c0788a7-7426-4545-8426-1170b75287d7","Type":"ContainerDied","Data":"6a2f0efcf9f152b1a4865f9e79cc97bc201491960eaf4a19d83789511f3642f9"} Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.502641 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.570079 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlzdz\" (UniqueName: \"kubernetes.io/projected/5c0788a7-7426-4545-8426-1170b75287d7-kube-api-access-wlzdz\") pod \"5c0788a7-7426-4545-8426-1170b75287d7\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.570142 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-catalog-content\") pod \"5c0788a7-7426-4545-8426-1170b75287d7\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.570224 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-utilities\") pod \"5c0788a7-7426-4545-8426-1170b75287d7\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.571126 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-utilities" (OuterVolumeSpecName: "utilities") pod "5c0788a7-7426-4545-8426-1170b75287d7" (UID: "5c0788a7-7426-4545-8426-1170b75287d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.579853 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0788a7-7426-4545-8426-1170b75287d7-kube-api-access-wlzdz" (OuterVolumeSpecName: "kube-api-access-wlzdz") pod "5c0788a7-7426-4545-8426-1170b75287d7" (UID: "5c0788a7-7426-4545-8426-1170b75287d7"). InnerVolumeSpecName "kube-api-access-wlzdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.627551 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c0788a7-7426-4545-8426-1170b75287d7" (UID: "5c0788a7-7426-4545-8426-1170b75287d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.672337 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.672408 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlzdz\" (UniqueName: \"kubernetes.io/projected/5c0788a7-7426-4545-8426-1170b75287d7-kube-api-access-wlzdz\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.672424 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.371585 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9755" event={"ID":"5c0788a7-7426-4545-8426-1170b75287d7","Type":"ContainerDied","Data":"b4986a5669e363b7c8dbe637152b9e988c9b8b567af7a16fb425dd9d24fbd246"} Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.371657 4778 scope.go:117] "RemoveContainer" containerID="6a2f0efcf9f152b1a4865f9e79cc97bc201491960eaf4a19d83789511f3642f9" Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.371682 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.375295 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" event={"ID":"721ee07f-fded-43ab-9bb7-2e4e56c98515","Type":"ContainerStarted","Data":"537efb928acc7327e2c8e054af18ecc834e18d233ce68ca00d26a2794beeb9b5"} Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.375623 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.398257 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" podStartSLOduration=2.451582765 podStartE2EDuration="6.39824016s" podCreationTimestamp="2026-03-18 09:17:47 +0000 UTC" firstStartedPulling="2026-03-18 09:17:48.349259555 +0000 UTC m=+934.924004395" lastFinishedPulling="2026-03-18 09:17:52.29591693 +0000 UTC m=+938.870661790" observedRunningTime="2026-03-18 09:17:53.395558017 +0000 UTC m=+939.970302867" watchObservedRunningTime="2026-03-18 09:17:53.39824016 +0000 UTC m=+939.972985000" Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.411604 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m9755"] Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.415275 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m9755"] Mar 18 09:17:54 crc kubenswrapper[4778]: I0318 09:17:54.197314 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c0788a7-7426-4545-8426-1170b75287d7" path="/var/lib/kubelet/pods/5c0788a7-7426-4545-8426-1170b75287d7/volumes" Mar 18 09:17:54 crc kubenswrapper[4778]: I0318 09:17:54.360424 4778 scope.go:117] "RemoveContainer" 
containerID="f08dc64e887f7d353c1fc7e71e4a22ef60d30f4e79ac6175a3b59c6d9cbf86f1" Mar 18 09:17:54 crc kubenswrapper[4778]: I0318 09:17:54.422232 4778 scope.go:117] "RemoveContainer" containerID="92fe5901b0aca7f6a3ebd767beba9a53c2021f0beb5e9b587ad8a027aeada4c3" Mar 18 09:17:55 crc kubenswrapper[4778]: I0318 09:17:55.395020 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" event={"ID":"75885bb8-adce-4801-8941-75042ab330ea","Type":"ContainerStarted","Data":"b0b09fbb86fb64d07e560329726f55daf10a4187b5d9f1ddb7228a41c2522d49"} Mar 18 09:17:55 crc kubenswrapper[4778]: I0318 09:17:55.395738 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:55 crc kubenswrapper[4778]: I0318 09:17:55.947467 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:55 crc kubenswrapper[4778]: I0318 09:17:55.974629 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" podStartSLOduration=3.271614058 podStartE2EDuration="8.974594102s" podCreationTimestamp="2026-03-18 09:17:47 +0000 UTC" firstStartedPulling="2026-03-18 09:17:48.759630037 +0000 UTC m=+935.334374877" lastFinishedPulling="2026-03-18 09:17:54.462610081 +0000 UTC m=+941.037354921" observedRunningTime="2026-03-18 09:17:55.423486419 +0000 UTC m=+941.998231269" watchObservedRunningTime="2026-03-18 09:17:55.974594102 +0000 UTC m=+942.549338962" Mar 18 09:17:55 crc kubenswrapper[4778]: I0318 09:17:55.999347 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.077442 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hdvpq"] Mar 18 
09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.409600 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hdvpq" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="registry-server" containerID="cri-o://8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e" gracePeriod=2 Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.787340 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.854769 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-utilities\") pod \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.854953 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-catalog-content\") pod \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.855029 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcmnt\" (UniqueName: \"kubernetes.io/projected/8418cdea-ff67-4e52-acf8-39176b7f0cb6-kube-api-access-dcmnt\") pod \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.856239 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-utilities" (OuterVolumeSpecName: "utilities") pod "8418cdea-ff67-4e52-acf8-39176b7f0cb6" (UID: "8418cdea-ff67-4e52-acf8-39176b7f0cb6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.863177 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8418cdea-ff67-4e52-acf8-39176b7f0cb6-kube-api-access-dcmnt" (OuterVolumeSpecName: "kube-api-access-dcmnt") pod "8418cdea-ff67-4e52-acf8-39176b7f0cb6" (UID: "8418cdea-ff67-4e52-acf8-39176b7f0cb6"). InnerVolumeSpecName "kube-api-access-dcmnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.957069 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.957119 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcmnt\" (UniqueName: \"kubernetes.io/projected/8418cdea-ff67-4e52-acf8-39176b7f0cb6-kube-api-access-dcmnt\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.020605 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8418cdea-ff67-4e52-acf8-39176b7f0cb6" (UID: "8418cdea-ff67-4e52-acf8-39176b7f0cb6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.058757 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.421479 4778 generic.go:334] "Generic (PLEG): container finished" podID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerID="8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e" exitCode=0 Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.421529 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerDied","Data":"8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e"} Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.421570 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.421593 4778 scope.go:117] "RemoveContainer" containerID="8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.421573 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerDied","Data":"a12438818ab44535475b9d63ec863fde1df42e8be31733d20b680eaccab2c7df"} Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.451379 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hdvpq"] Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.454045 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hdvpq"] Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.457187 4778 scope.go:117] "RemoveContainer" containerID="6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.482352 4778 scope.go:117] "RemoveContainer" containerID="20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.516954 4778 scope.go:117] "RemoveContainer" containerID="8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e" Mar 18 09:17:58 crc kubenswrapper[4778]: E0318 09:17:58.518643 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e\": container with ID starting with 8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e not found: ID does not exist" containerID="8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.518735 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e"} err="failed to get container status \"8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e\": rpc error: code = NotFound desc = could not find container \"8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e\": container with ID starting with 8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e not found: ID does not exist" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.518930 4778 scope.go:117] "RemoveContainer" containerID="6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff" Mar 18 09:17:58 crc kubenswrapper[4778]: E0318 09:17:58.519640 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff\": container with ID starting with 6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff not found: ID does not exist" containerID="6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.519688 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff"} err="failed to get container status \"6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff\": rpc error: code = NotFound desc = could not find container \"6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff\": container with ID starting with 6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff not found: ID does not exist" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.519716 4778 scope.go:117] "RemoveContainer" containerID="20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543" Mar 18 09:17:58 crc kubenswrapper[4778]: E0318 
09:17:58.520036 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543\": container with ID starting with 20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543 not found: ID does not exist" containerID="20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.520055 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543"} err="failed to get container status \"20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543\": rpc error: code = NotFound desc = could not find container \"20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543\": container with ID starting with 20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543 not found: ID does not exist" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.144831 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563758-zslfz"] Mar 18 09:18:00 crc kubenswrapper[4778]: E0318 09:18:00.145801 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="extract-utilities" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.145822 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="extract-utilities" Mar 18 09:18:00 crc kubenswrapper[4778]: E0318 09:18:00.145842 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="extract-content" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.145851 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="extract-content" Mar 18 09:18:00 crc 
kubenswrapper[4778]: E0318 09:18:00.145868 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.145877 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4778]: E0318 09:18:00.145899 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="extract-utilities" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.145907 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="extract-utilities" Mar 18 09:18:00 crc kubenswrapper[4778]: E0318 09:18:00.145926 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.145935 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4778]: E0318 09:18:00.145945 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="extract-content" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.145953 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="extract-content" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.146090 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.146104 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="registry-server" Mar 18 09:18:00 crc 
kubenswrapper[4778]: I0318 09:18:00.146725 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.148550 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.152874 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.155607 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.161689 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563758-zslfz"] Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.186529 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtggs\" (UniqueName: \"kubernetes.io/projected/c70faab0-9f07-4452-a873-bcb59d28b7a8-kube-api-access-jtggs\") pod \"auto-csr-approver-29563758-zslfz\" (UID: \"c70faab0-9f07-4452-a873-bcb59d28b7a8\") " pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.196115 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" path="/var/lib/kubelet/pods/8418cdea-ff67-4e52-acf8-39176b7f0cb6/volumes" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.288514 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtggs\" (UniqueName: \"kubernetes.io/projected/c70faab0-9f07-4452-a873-bcb59d28b7a8-kube-api-access-jtggs\") pod \"auto-csr-approver-29563758-zslfz\" (UID: \"c70faab0-9f07-4452-a873-bcb59d28b7a8\") " pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:00 
crc kubenswrapper[4778]: I0318 09:18:00.314532 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtggs\" (UniqueName: \"kubernetes.io/projected/c70faab0-9f07-4452-a873-bcb59d28b7a8-kube-api-access-jtggs\") pod \"auto-csr-approver-29563758-zslfz\" (UID: \"c70faab0-9f07-4452-a873-bcb59d28b7a8\") " pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.468683 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.819917 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563758-zslfz"] Mar 18 09:18:01 crc kubenswrapper[4778]: I0318 09:18:01.446415 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563758-zslfz" event={"ID":"c70faab0-9f07-4452-a873-bcb59d28b7a8","Type":"ContainerStarted","Data":"43a583205e8901d1305cfd0347ba2c715870319698b6b5c1fcb4dff75dd395ea"} Mar 18 09:18:02 crc kubenswrapper[4778]: I0318 09:18:02.454431 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563758-zslfz" event={"ID":"c70faab0-9f07-4452-a873-bcb59d28b7a8","Type":"ContainerStarted","Data":"392df7bab826882632f84664710c42da5399ea661a1dbfcd15aec0ad5d248553"} Mar 18 09:18:02 crc kubenswrapper[4778]: I0318 09:18:02.477940 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563758-zslfz" podStartSLOduration=1.406547748 podStartE2EDuration="2.47791519s" podCreationTimestamp="2026-03-18 09:18:00 +0000 UTC" firstStartedPulling="2026-03-18 09:18:00.826367699 +0000 UTC m=+947.401112539" lastFinishedPulling="2026-03-18 09:18:01.897735131 +0000 UTC m=+948.472479981" observedRunningTime="2026-03-18 09:18:02.471600898 +0000 UTC m=+949.046345748" watchObservedRunningTime="2026-03-18 
09:18:02.47791519 +0000 UTC m=+949.052660020" Mar 18 09:18:03 crc kubenswrapper[4778]: I0318 09:18:03.463710 4778 generic.go:334] "Generic (PLEG): container finished" podID="c70faab0-9f07-4452-a873-bcb59d28b7a8" containerID="392df7bab826882632f84664710c42da5399ea661a1dbfcd15aec0ad5d248553" exitCode=0 Mar 18 09:18:03 crc kubenswrapper[4778]: I0318 09:18:03.463840 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563758-zslfz" event={"ID":"c70faab0-9f07-4452-a873-bcb59d28b7a8","Type":"ContainerDied","Data":"392df7bab826882632f84664710c42da5399ea661a1dbfcd15aec0ad5d248553"} Mar 18 09:18:04 crc kubenswrapper[4778]: I0318 09:18:04.764620 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:04 crc kubenswrapper[4778]: I0318 09:18:04.851083 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtggs\" (UniqueName: \"kubernetes.io/projected/c70faab0-9f07-4452-a873-bcb59d28b7a8-kube-api-access-jtggs\") pod \"c70faab0-9f07-4452-a873-bcb59d28b7a8\" (UID: \"c70faab0-9f07-4452-a873-bcb59d28b7a8\") " Mar 18 09:18:04 crc kubenswrapper[4778]: I0318 09:18:04.875480 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70faab0-9f07-4452-a873-bcb59d28b7a8-kube-api-access-jtggs" (OuterVolumeSpecName: "kube-api-access-jtggs") pod "c70faab0-9f07-4452-a873-bcb59d28b7a8" (UID: "c70faab0-9f07-4452-a873-bcb59d28b7a8"). InnerVolumeSpecName "kube-api-access-jtggs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:18:04 crc kubenswrapper[4778]: I0318 09:18:04.953102 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtggs\" (UniqueName: \"kubernetes.io/projected/c70faab0-9f07-4452-a873-bcb59d28b7a8-kube-api-access-jtggs\") on node \"crc\" DevicePath \"\"" Mar 18 09:18:05 crc kubenswrapper[4778]: I0318 09:18:05.478531 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563758-zslfz" event={"ID":"c70faab0-9f07-4452-a873-bcb59d28b7a8","Type":"ContainerDied","Data":"43a583205e8901d1305cfd0347ba2c715870319698b6b5c1fcb4dff75dd395ea"} Mar 18 09:18:05 crc kubenswrapper[4778]: I0318 09:18:05.478575 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43a583205e8901d1305cfd0347ba2c715870319698b6b5c1fcb4dff75dd395ea" Mar 18 09:18:05 crc kubenswrapper[4778]: I0318 09:18:05.478629 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:05 crc kubenswrapper[4778]: I0318 09:18:05.528672 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563752-h72kq"] Mar 18 09:18:05 crc kubenswrapper[4778]: I0318 09:18:05.531961 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563752-h72kq"] Mar 18 09:18:06 crc kubenswrapper[4778]: I0318 09:18:06.197268 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e614a6-a447-41bc-b7c8-034610af7d59" path="/var/lib/kubelet/pods/57e614a6-a447-41bc-b7c8-034610af7d59/volumes" Mar 18 09:18:08 crc kubenswrapper[4778]: I0318 09:18:08.299618 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:18:19 crc kubenswrapper[4778]: I0318 09:18:19.849317 4778 scope.go:117] "RemoveContainer" 
containerID="6614d11a5de4463d54d3a021b1144b715f14eddffa1ef95f83bb20fa8f58ca90" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.025846 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.880266 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-g2q8m"] Mar 18 09:18:28 crc kubenswrapper[4778]: E0318 09:18:28.880551 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70faab0-9f07-4452-a873-bcb59d28b7a8" containerName="oc" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.880570 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70faab0-9f07-4452-a873-bcb59d28b7a8" containerName="oc" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.880743 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70faab0-9f07-4452-a873-bcb59d28b7a8" containerName="oc" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.883474 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.886607 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.886955 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.887080 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-25ntf" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.888621 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv"] Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.889489 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.895392 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.900948 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv"] Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.992280 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wd69x"] Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.993184 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wd69x" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.997565 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.997681 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.997763 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-rs6jl" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.997824 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998453 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-reloader\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998500 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" 
(UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-metrics\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998518 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f18e9f0-b3eb-440a-b035-ed8256df5ed9-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jrtjv\" (UID: \"0f18e9f0-b3eb-440a-b035-ed8256df5ed9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998545 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-metrics-certs\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998575 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-sockets\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998590 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsdtt\" (UniqueName: \"kubernetes.io/projected/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-kube-api-access-zsdtt\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998617 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-conf\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998643 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-startup\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998658 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8jr\" (UniqueName: \"kubernetes.io/projected/0f18e9f0-b3eb-440a-b035-ed8256df5ed9-kube-api-access-7k8jr\") pod \"frr-k8s-webhook-server-bcc4b6f68-jrtjv\" (UID: \"0f18e9f0-b3eb-440a-b035-ed8256df5ed9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.027162 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-sv9kd"] Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.028005 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.033438 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.041014 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-sv9kd"] Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.099896 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metallb-excludel2\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.099973 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dn57\" (UniqueName: \"kubernetes.io/projected/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-kube-api-access-5dn57\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100000 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-metrics\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100022 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f18e9f0-b3eb-440a-b035-ed8256df5ed9-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jrtjv\" (UID: \"0f18e9f0-b3eb-440a-b035-ed8256df5ed9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:29 crc 
kubenswrapper[4778]: I0318 09:18:29.100245 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-metrics-certs\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100290 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metrics-certs\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100387 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-sockets\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100405 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsdtt\" (UniqueName: \"kubernetes.io/projected/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-kube-api-access-zsdtt\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100500 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-conf\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100592 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100622 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xj2p\" (UniqueName: \"kubernetes.io/projected/1c97662e-d673-42c1-a6ad-75865ba2b8b6-kube-api-access-2xj2p\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100900 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-startup\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100924 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8jr\" (UniqueName: \"kubernetes.io/projected/0f18e9f0-b3eb-440a-b035-ed8256df5ed9-kube-api-access-7k8jr\") pod \"frr-k8s-webhook-server-bcc4b6f68-jrtjv\" (UID: \"0f18e9f0-b3eb-440a-b035-ed8256df5ed9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.101716 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-metrics-certs\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100859 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-conf\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.101378 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-sockets\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.101674 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-startup\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.101819 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-reloader\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.101836 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-cert\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.102030 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-reloader\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc 
kubenswrapper[4778]: I0318 09:18:29.100704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-metrics\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.105466 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f18e9f0-b3eb-440a-b035-ed8256df5ed9-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jrtjv\" (UID: \"0f18e9f0-b3eb-440a-b035-ed8256df5ed9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.118887 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-metrics-certs\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.122941 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsdtt\" (UniqueName: \"kubernetes.io/projected/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-kube-api-access-zsdtt\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.130952 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8jr\" (UniqueName: \"kubernetes.io/projected/0f18e9f0-b3eb-440a-b035-ed8256df5ed9-kube-api-access-7k8jr\") pod \"frr-k8s-webhook-server-bcc4b6f68-jrtjv\" (UID: \"0f18e9f0-b3eb-440a-b035-ed8256df5ed9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"memberlist\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203162 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xj2p\" (UniqueName: \"kubernetes.io/projected/1c97662e-d673-42c1-a6ad-75865ba2b8b6-kube-api-access-2xj2p\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203189 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-metrics-certs\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203247 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-cert\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.203273 4778 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.203324 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist podName:1c97662e-d673-42c1-a6ad-75865ba2b8b6 nodeName:}" failed. No retries permitted until 2026-03-18 09:18:29.703309531 +0000 UTC m=+976.278054371 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist") pod "speaker-wd69x" (UID: "1c97662e-d673-42c1-a6ad-75865ba2b8b6") : secret "metallb-memberlist" not found Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.203415 4778 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.203501 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-metrics-certs podName:1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c nodeName:}" failed. No retries permitted until 2026-03-18 09:18:29.703479235 +0000 UTC m=+976.278224155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-metrics-certs") pod "controller-7bb4cc7c98-sv9kd" (UID: "1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c") : secret "controller-certs-secret" not found Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203275 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metallb-excludel2\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203594 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dn57\" (UniqueName: \"kubernetes.io/projected/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-kube-api-access-5dn57\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203668 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metrics-certs\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.203879 4778 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.203918 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metrics-certs podName:1c97662e-d673-42c1-a6ad-75865ba2b8b6 nodeName:}" failed. No retries permitted until 2026-03-18 09:18:29.703907647 +0000 UTC m=+976.278652497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metrics-certs") pod "speaker-wd69x" (UID: "1c97662e-d673-42c1-a6ad-75865ba2b8b6") : secret "speaker-certs-secret" not found Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.204027 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metallb-excludel2\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.205459 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.217760 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-cert\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.221738 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xj2p\" (UniqueName: \"kubernetes.io/projected/1c97662e-d673-42c1-a6ad-75865ba2b8b6-kube-api-access-2xj2p\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.222337 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dn57\" (UniqueName: \"kubernetes.io/projected/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-kube-api-access-5dn57\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.264748 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.286921 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.669758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"1656963afb085398b95b20b2c8ad4561a38a374e4b575785195a5cb57a6b4b19"} Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.706913 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv"] Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.709966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.710009 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-metrics-certs\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.710060 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metrics-certs\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.710328 4778 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.710511 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist podName:1c97662e-d673-42c1-a6ad-75865ba2b8b6 nodeName:}" failed. No retries permitted until 2026-03-18 09:18:30.71049002 +0000 UTC m=+977.285234880 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist") pod "speaker-wd69x" (UID: "1c97662e-d673-42c1-a6ad-75865ba2b8b6") : secret "metallb-memberlist" not found Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.716127 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metrics-certs\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.716180 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-metrics-certs\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: W0318 09:18:29.717776 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f18e9f0_b3eb_440a_b035_ed8256df5ed9.slice/crio-fb0879a587988f0d70b160d9cd7a1d7dfe060ccfd641ea41bdaae0f43d8649fa WatchSource:0}: Error finding container fb0879a587988f0d70b160d9cd7a1d7dfe060ccfd641ea41bdaae0f43d8649fa: Status 404 returned error can't find the container with id fb0879a587988f0d70b160d9cd7a1d7dfe060ccfd641ea41bdaae0f43d8649fa Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.943856 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.242171 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-sv9kd"] Mar 18 09:18:30 crc kubenswrapper[4778]: W0318 09:18:30.269728 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddcd9d2_a5d0_4773_93f5_8eb9c0fff72c.slice/crio-5083ad7f107da57ef1b66367d5702cd70cf0a3452dafb15d7b844ccbd61a363b WatchSource:0}: Error finding container 5083ad7f107da57ef1b66367d5702cd70cf0a3452dafb15d7b844ccbd61a363b: Status 404 returned error can't find the container with id 5083ad7f107da57ef1b66367d5702cd70cf0a3452dafb15d7b844ccbd61a363b Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.676011 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-sv9kd" event={"ID":"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c","Type":"ContainerStarted","Data":"e7e1976f9564b551d2caf4ad3d741907575b2639672bc6fb228e75a6af818ec4"} Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.676062 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-sv9kd" event={"ID":"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c","Type":"ContainerStarted","Data":"6272aa5dc6d96b92cfb8bbcc1298d1985e685e5cd8a966efb1b4fc4defb883ca"} Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.676076 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-sv9kd" event={"ID":"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c","Type":"ContainerStarted","Data":"5083ad7f107da57ef1b66367d5702cd70cf0a3452dafb15d7b844ccbd61a363b"} Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.676836 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.677888 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" event={"ID":"0f18e9f0-b3eb-440a-b035-ed8256df5ed9","Type":"ContainerStarted","Data":"fb0879a587988f0d70b160d9cd7a1d7dfe060ccfd641ea41bdaae0f43d8649fa"} Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.696006 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-sv9kd" podStartSLOduration=1.6959874080000001 podStartE2EDuration="1.695987408s" podCreationTimestamp="2026-03-18 09:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:18:30.694666662 +0000 UTC m=+977.269411512" watchObservedRunningTime="2026-03-18 09:18:30.695987408 +0000 UTC m=+977.270732248" Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.723025 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.741595 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.812992 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-wd69x" Mar 18 09:18:31 crc kubenswrapper[4778]: I0318 09:18:31.694491 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wd69x" event={"ID":"1c97662e-d673-42c1-a6ad-75865ba2b8b6","Type":"ContainerStarted","Data":"32ec92ba8d5cf30c9d775a8c04df502ef02fc21a6beae70d88527f6e68468e89"} Mar 18 09:18:31 crc kubenswrapper[4778]: I0318 09:18:31.694562 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wd69x" event={"ID":"1c97662e-d673-42c1-a6ad-75865ba2b8b6","Type":"ContainerStarted","Data":"281c99793b07759f67e5e3a4adb10852e12a93e9ff5202a70926999ecff20cc6"} Mar 18 09:18:31 crc kubenswrapper[4778]: I0318 09:18:31.694573 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wd69x" event={"ID":"1c97662e-d673-42c1-a6ad-75865ba2b8b6","Type":"ContainerStarted","Data":"41bbda6b48a7fd0c4eaede34ee7f01b13522e0aa9640c24765b5205f525f20df"} Mar 18 09:18:31 crc kubenswrapper[4778]: I0318 09:18:31.694816 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wd69x" Mar 18 09:18:31 crc kubenswrapper[4778]: I0318 09:18:31.715758 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wd69x" podStartSLOduration=3.715735354 podStartE2EDuration="3.715735354s" podCreationTimestamp="2026-03-18 09:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:18:31.711884118 +0000 UTC m=+978.286628958" watchObservedRunningTime="2026-03-18 09:18:31.715735354 +0000 UTC m=+978.290480194" Mar 18 09:18:37 crc kubenswrapper[4778]: I0318 09:18:37.740882 4778 generic.go:334] "Generic (PLEG): container finished" podID="5efed87b-ad9c-4703-b3c4-2d6ab8d0883b" containerID="4028e6c25bc19e614400f65d49c2b1ededcd6352f700b801dc89da8753b3257e" exitCode=0 Mar 18 09:18:37 crc kubenswrapper[4778]: 
I0318 09:18:37.740979 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerDied","Data":"4028e6c25bc19e614400f65d49c2b1ededcd6352f700b801dc89da8753b3257e"} Mar 18 09:18:37 crc kubenswrapper[4778]: I0318 09:18:37.751004 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" event={"ID":"0f18e9f0-b3eb-440a-b035-ed8256df5ed9","Type":"ContainerStarted","Data":"33d80fcbe768658eea05319a17b88b75262d3e15c089b226a8d01e293f508f6d"} Mar 18 09:18:37 crc kubenswrapper[4778]: I0318 09:18:37.751501 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:37 crc kubenswrapper[4778]: I0318 09:18:37.791433 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" podStartSLOduration=2.5618729609999997 podStartE2EDuration="9.791406685s" podCreationTimestamp="2026-03-18 09:18:28 +0000 UTC" firstStartedPulling="2026-03-18 09:18:29.719466565 +0000 UTC m=+976.294211415" lastFinishedPulling="2026-03-18 09:18:36.949000289 +0000 UTC m=+983.523745139" observedRunningTime="2026-03-18 09:18:37.786949273 +0000 UTC m=+984.361694153" watchObservedRunningTime="2026-03-18 09:18:37.791406685 +0000 UTC m=+984.366151525" Mar 18 09:18:38 crc kubenswrapper[4778]: I0318 09:18:38.761961 4778 generic.go:334] "Generic (PLEG): container finished" podID="5efed87b-ad9c-4703-b3c4-2d6ab8d0883b" containerID="884d2e9957c49e96ac45cd1886b7ca9e7fcaa2bd9fbd23ff2ecc70afad84b846" exitCode=0 Mar 18 09:18:38 crc kubenswrapper[4778]: I0318 09:18:38.762059 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerDied","Data":"884d2e9957c49e96ac45cd1886b7ca9e7fcaa2bd9fbd23ff2ecc70afad84b846"} Mar 18 
09:18:39 crc kubenswrapper[4778]: I0318 09:18:39.772753 4778 generic.go:334] "Generic (PLEG): container finished" podID="5efed87b-ad9c-4703-b3c4-2d6ab8d0883b" containerID="c10bc7206eb6bd274631b482b8196498e8e6de8ffa09c8a6daef2871d80ae65f" exitCode=0 Mar 18 09:18:39 crc kubenswrapper[4778]: I0318 09:18:39.772817 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerDied","Data":"c10bc7206eb6bd274631b482b8196498e8e6de8ffa09c8a6daef2871d80ae65f"} Mar 18 09:18:40 crc kubenswrapper[4778]: I0318 09:18:40.788752 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"3a0199f34a6190d90fb2b310b4578905a6ed7c36423fc454e725ea8982a1a481"} Mar 18 09:18:40 crc kubenswrapper[4778]: I0318 09:18:40.789148 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"3a3d3ca8de6b6bd608a39bbc12994bd285b221f443de2025e2710e1cade996f4"} Mar 18 09:18:40 crc kubenswrapper[4778]: I0318 09:18:40.789160 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"bfcf62580963ed5c78918d581c18981bc84a2ee2d1212aa05aafe76a6db44187"} Mar 18 09:18:40 crc kubenswrapper[4778]: I0318 09:18:40.789171 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"e6717b05aba2987915fb79fbe80e7310de063f714f06090157916d206897bae9"} Mar 18 09:18:40 crc kubenswrapper[4778]: I0318 09:18:40.789180 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" 
event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"2584a5734ad79e77a7315cd8fff0259394f0fa965c0a8f63ac4ec5e7c5c410a4"} Mar 18 09:18:41 crc kubenswrapper[4778]: I0318 09:18:41.803481 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"89df37fc150fde5e7b3406ece065011b4e08e3652093c7c2be1ec0b316ca5010"} Mar 18 09:18:41 crc kubenswrapper[4778]: I0318 09:18:41.803699 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:41 crc kubenswrapper[4778]: I0318 09:18:41.831302 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-g2q8m" podStartSLOduration=6.246856879 podStartE2EDuration="13.831276126s" podCreationTimestamp="2026-03-18 09:18:28 +0000 UTC" firstStartedPulling="2026-03-18 09:18:29.392878035 +0000 UTC m=+975.967622875" lastFinishedPulling="2026-03-18 09:18:36.977297272 +0000 UTC m=+983.552042122" observedRunningTime="2026-03-18 09:18:41.830806013 +0000 UTC m=+988.405550913" watchObservedRunningTime="2026-03-18 09:18:41.831276126 +0000 UTC m=+988.406021006" Mar 18 09:18:44 crc kubenswrapper[4778]: I0318 09:18:44.265854 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:44 crc kubenswrapper[4778]: I0318 09:18:44.336142 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:49 crc kubenswrapper[4778]: I0318 09:18:49.268887 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:49 crc kubenswrapper[4778]: I0318 09:18:49.291635 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:49 crc kubenswrapper[4778]: I0318 
09:18:49.949274 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:50 crc kubenswrapper[4778]: I0318 09:18:50.816426 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wd69x" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.681826 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wq58v"] Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.683547 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.688342 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-bj7kp" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.688864 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.689240 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.729980 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wq58v"] Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.817190 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rsch\" (UniqueName: \"kubernetes.io/projected/af88af30-254e-4fc3-a29c-a27a6c5fc237-kube-api-access-2rsch\") pod \"openstack-operator-index-wq58v\" (UID: \"af88af30-254e-4fc3-a29c-a27a6c5fc237\") " pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.919033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2rsch\" (UniqueName: \"kubernetes.io/projected/af88af30-254e-4fc3-a29c-a27a6c5fc237-kube-api-access-2rsch\") pod \"openstack-operator-index-wq58v\" (UID: \"af88af30-254e-4fc3-a29c-a27a6c5fc237\") " pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.947477 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rsch\" (UniqueName: \"kubernetes.io/projected/af88af30-254e-4fc3-a29c-a27a6c5fc237-kube-api-access-2rsch\") pod \"openstack-operator-index-wq58v\" (UID: \"af88af30-254e-4fc3-a29c-a27a6c5fc237\") " pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:54 crc kubenswrapper[4778]: I0318 09:18:54.008322 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:54 crc kubenswrapper[4778]: I0318 09:18:54.441974 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wq58v"] Mar 18 09:18:54 crc kubenswrapper[4778]: W0318 09:18:54.446576 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf88af30_254e_4fc3_a29c_a27a6c5fc237.slice/crio-00c9aec972ccc889c267a8674b92b068b7f18f6f77e302ded3904a531ffc6a65 WatchSource:0}: Error finding container 00c9aec972ccc889c267a8674b92b068b7f18f6f77e302ded3904a531ffc6a65: Status 404 returned error can't find the container with id 00c9aec972ccc889c267a8674b92b068b7f18f6f77e302ded3904a531ffc6a65 Mar 18 09:18:54 crc kubenswrapper[4778]: I0318 09:18:54.898615 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wq58v" event={"ID":"af88af30-254e-4fc3-a29c-a27a6c5fc237","Type":"ContainerStarted","Data":"00c9aec972ccc889c267a8674b92b068b7f18f6f77e302ded3904a531ffc6a65"} Mar 18 09:18:56 crc kubenswrapper[4778]: I0318 09:18:56.456324 4778 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wq58v"] Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.064121 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-v7qxm"] Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.065169 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.079579 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v7qxm"] Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.175742 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d28wd\" (UniqueName: \"kubernetes.io/projected/c508c810-232f-48c1-8d15-bbbb118d2948-kube-api-access-d28wd\") pod \"openstack-operator-index-v7qxm\" (UID: \"c508c810-232f-48c1-8d15-bbbb118d2948\") " pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.277592 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d28wd\" (UniqueName: \"kubernetes.io/projected/c508c810-232f-48c1-8d15-bbbb118d2948-kube-api-access-d28wd\") pod \"openstack-operator-index-v7qxm\" (UID: \"c508c810-232f-48c1-8d15-bbbb118d2948\") " pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.300417 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d28wd\" (UniqueName: \"kubernetes.io/projected/c508c810-232f-48c1-8d15-bbbb118d2948-kube-api-access-d28wd\") pod \"openstack-operator-index-v7qxm\" (UID: \"c508c810-232f-48c1-8d15-bbbb118d2948\") " pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.389910 4778 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.849703 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v7qxm"] Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.920260 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v7qxm" event={"ID":"c508c810-232f-48c1-8d15-bbbb118d2948","Type":"ContainerStarted","Data":"fde3b5949c32154e6d1b4e565f49e7dea0a53d0b4d82148511eacb0596e02c86"} Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.923316 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wq58v" event={"ID":"af88af30-254e-4fc3-a29c-a27a6c5fc237","Type":"ContainerStarted","Data":"ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0"} Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.923634 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wq58v" podUID="af88af30-254e-4fc3-a29c-a27a6c5fc237" containerName="registry-server" containerID="cri-o://ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0" gracePeriod=2 Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.945017 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wq58v" podStartSLOduration=2.648930152 podStartE2EDuration="4.944991118s" podCreationTimestamp="2026-03-18 09:18:53 +0000 UTC" firstStartedPulling="2026-03-18 09:18:54.448361222 +0000 UTC m=+1001.023106062" lastFinishedPulling="2026-03-18 09:18:56.744422178 +0000 UTC m=+1003.319167028" observedRunningTime="2026-03-18 09:18:57.942403207 +0000 UTC m=+1004.517148047" watchObservedRunningTime="2026-03-18 09:18:57.944991118 +0000 UTC m=+1004.519735968" Mar 18 09:18:58 crc kubenswrapper[4778]: 
I0318 09:18:58.320758 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.397274 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rsch\" (UniqueName: \"kubernetes.io/projected/af88af30-254e-4fc3-a29c-a27a6c5fc237-kube-api-access-2rsch\") pod \"af88af30-254e-4fc3-a29c-a27a6c5fc237\" (UID: \"af88af30-254e-4fc3-a29c-a27a6c5fc237\") " Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.404106 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af88af30-254e-4fc3-a29c-a27a6c5fc237-kube-api-access-2rsch" (OuterVolumeSpecName: "kube-api-access-2rsch") pod "af88af30-254e-4fc3-a29c-a27a6c5fc237" (UID: "af88af30-254e-4fc3-a29c-a27a6c5fc237"). InnerVolumeSpecName "kube-api-access-2rsch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.501363 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rsch\" (UniqueName: \"kubernetes.io/projected/af88af30-254e-4fc3-a29c-a27a6c5fc237-kube-api-access-2rsch\") on node \"crc\" DevicePath \"\"" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.932229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v7qxm" event={"ID":"c508c810-232f-48c1-8d15-bbbb118d2948","Type":"ContainerStarted","Data":"b07fb00176255682c76c04b0680ea7bbe25fa81d6391ca7ba238ec1a72cb8051"} Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.933911 4778 generic.go:334] "Generic (PLEG): container finished" podID="af88af30-254e-4fc3-a29c-a27a6c5fc237" containerID="ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0" exitCode=0 Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.933997 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.933993 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wq58v" event={"ID":"af88af30-254e-4fc3-a29c-a27a6c5fc237","Type":"ContainerDied","Data":"ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0"} Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.934120 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wq58v" event={"ID":"af88af30-254e-4fc3-a29c-a27a6c5fc237","Type":"ContainerDied","Data":"00c9aec972ccc889c267a8674b92b068b7f18f6f77e302ded3904a531ffc6a65"} Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.934150 4778 scope.go:117] "RemoveContainer" containerID="ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.953633 4778 scope.go:117] "RemoveContainer" containerID="ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0" Mar 18 09:18:58 crc kubenswrapper[4778]: E0318 09:18:58.954173 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0\": container with ID starting with ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0 not found: ID does not exist" containerID="ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.954243 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0"} err="failed to get container status \"ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0\": rpc error: code = NotFound desc = could not find container 
\"ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0\": container with ID starting with ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0 not found: ID does not exist" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.961857 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-v7qxm" podStartSLOduration=1.905758721 podStartE2EDuration="1.961839554s" podCreationTimestamp="2026-03-18 09:18:57 +0000 UTC" firstStartedPulling="2026-03-18 09:18:57.868754394 +0000 UTC m=+1004.443499244" lastFinishedPulling="2026-03-18 09:18:57.924835237 +0000 UTC m=+1004.499580077" observedRunningTime="2026-03-18 09:18:58.957629118 +0000 UTC m=+1005.532373958" watchObservedRunningTime="2026-03-18 09:18:58.961839554 +0000 UTC m=+1005.536584394" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.978413 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wq58v"] Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.982076 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wq58v"] Mar 18 09:19:00 crc kubenswrapper[4778]: I0318 09:19:00.147449 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:19:00 crc kubenswrapper[4778]: I0318 09:19:00.148032 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:19:00 crc kubenswrapper[4778]: I0318 09:19:00.197427 
4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af88af30-254e-4fc3-a29c-a27a6c5fc237" path="/var/lib/kubelet/pods/af88af30-254e-4fc3-a29c-a27a6c5fc237/volumes" Mar 18 09:19:07 crc kubenswrapper[4778]: I0318 09:19:07.390900 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:19:07 crc kubenswrapper[4778]: I0318 09:19:07.391859 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:19:07 crc kubenswrapper[4778]: I0318 09:19:07.483627 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:19:08 crc kubenswrapper[4778]: I0318 09:19:08.050983 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.634788 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb"] Mar 18 09:19:13 crc kubenswrapper[4778]: E0318 09:19:13.635650 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af88af30-254e-4fc3-a29c-a27a6c5fc237" containerName="registry-server" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.635669 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="af88af30-254e-4fc3-a29c-a27a6c5fc237" containerName="registry-server" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.635843 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="af88af30-254e-4fc3-a29c-a27a6c5fc237" containerName="registry-server" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.636857 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.639808 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-hqfxz" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.655005 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb"] Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.736599 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjnq5\" (UniqueName: \"kubernetes.io/projected/bf055ff8-8bbd-4628-a5ad-c765775e8f16-kube-api-access-jjnq5\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.736684 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-bundle\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.736787 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-util\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 
09:19:13.838412 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-util\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.838494 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjnq5\" (UniqueName: \"kubernetes.io/projected/bf055ff8-8bbd-4628-a5ad-c765775e8f16-kube-api-access-jjnq5\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.838537 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-bundle\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.839354 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-bundle\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.839582 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-util\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.862362 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjnq5\" (UniqueName: \"kubernetes.io/projected/bf055ff8-8bbd-4628-a5ad-c765775e8f16-kube-api-access-jjnq5\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.955615 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:14 crc kubenswrapper[4778]: I0318 09:19:14.227932 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb"] Mar 18 09:19:15 crc kubenswrapper[4778]: I0318 09:19:15.061148 4778 generic.go:334] "Generic (PLEG): container finished" podID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerID="47836f9ce3326802fb91a0906c882b64a2ee8f615265698a7da9aa2e7af057a3" exitCode=0 Mar 18 09:19:15 crc kubenswrapper[4778]: I0318 09:19:15.061261 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" event={"ID":"bf055ff8-8bbd-4628-a5ad-c765775e8f16","Type":"ContainerDied","Data":"47836f9ce3326802fb91a0906c882b64a2ee8f615265698a7da9aa2e7af057a3"} Mar 18 09:19:15 crc kubenswrapper[4778]: I0318 09:19:15.061587 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" event={"ID":"bf055ff8-8bbd-4628-a5ad-c765775e8f16","Type":"ContainerStarted","Data":"8529cd59e47eae09ff5610bf217b94cdbf4dedff1d056cb9d0c8ab700e2b08b9"} Mar 18 09:19:16 crc kubenswrapper[4778]: I0318 09:19:16.072470 4778 generic.go:334] "Generic (PLEG): container finished" podID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerID="84728ae18eda0b7e5585f55947b66e6ed6fb4b9198db7f6cb581530b764fd135" exitCode=0 Mar 18 09:19:16 crc kubenswrapper[4778]: I0318 09:19:16.072528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" event={"ID":"bf055ff8-8bbd-4628-a5ad-c765775e8f16","Type":"ContainerDied","Data":"84728ae18eda0b7e5585f55947b66e6ed6fb4b9198db7f6cb581530b764fd135"} Mar 18 09:19:17 crc kubenswrapper[4778]: I0318 09:19:17.080737 4778 generic.go:334] "Generic (PLEG): container finished" podID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerID="5b08df49541db16461b6000e0757b929059d014d5332c54dd24d345ac2134d55" exitCode=0 Mar 18 09:19:17 crc kubenswrapper[4778]: I0318 09:19:17.080821 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" event={"ID":"bf055ff8-8bbd-4628-a5ad-c765775e8f16","Type":"ContainerDied","Data":"5b08df49541db16461b6000e0757b929059d014d5332c54dd24d345ac2134d55"} Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.400070 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.508410 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-util\") pod \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.508777 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-bundle\") pod \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.510083 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjnq5\" (UniqueName: \"kubernetes.io/projected/bf055ff8-8bbd-4628-a5ad-c765775e8f16-kube-api-access-jjnq5\") pod \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.510683 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-bundle" (OuterVolumeSpecName: "bundle") pod "bf055ff8-8bbd-4628-a5ad-c765775e8f16" (UID: "bf055ff8-8bbd-4628-a5ad-c765775e8f16"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.518164 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf055ff8-8bbd-4628-a5ad-c765775e8f16-kube-api-access-jjnq5" (OuterVolumeSpecName: "kube-api-access-jjnq5") pod "bf055ff8-8bbd-4628-a5ad-c765775e8f16" (UID: "bf055ff8-8bbd-4628-a5ad-c765775e8f16"). InnerVolumeSpecName "kube-api-access-jjnq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.529598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-util" (OuterVolumeSpecName: "util") pod "bf055ff8-8bbd-4628-a5ad-c765775e8f16" (UID: "bf055ff8-8bbd-4628-a5ad-c765775e8f16"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.611949 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-util\") on node \"crc\" DevicePath \"\"" Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.612013 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.612028 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjnq5\" (UniqueName: \"kubernetes.io/projected/bf055ff8-8bbd-4628-a5ad-c765775e8f16-kube-api-access-jjnq5\") on node \"crc\" DevicePath \"\"" Mar 18 09:19:19 crc kubenswrapper[4778]: I0318 09:19:19.113721 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" event={"ID":"bf055ff8-8bbd-4628-a5ad-c765775e8f16","Type":"ContainerDied","Data":"8529cd59e47eae09ff5610bf217b94cdbf4dedff1d056cb9d0c8ab700e2b08b9"} Mar 18 09:19:19 crc kubenswrapper[4778]: I0318 09:19:19.113773 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8529cd59e47eae09ff5610bf217b94cdbf4dedff1d056cb9d0c8ab700e2b08b9" Mar 18 09:19:19 crc kubenswrapper[4778]: I0318 09:19:19.113857 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.860234 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb"] Mar 18 09:19:25 crc kubenswrapper[4778]: E0318 09:19:25.861418 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="pull" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.861438 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="pull" Mar 18 09:19:25 crc kubenswrapper[4778]: E0318 09:19:25.861465 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="util" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.861471 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="util" Mar 18 09:19:25 crc kubenswrapper[4778]: E0318 09:19:25.861477 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="extract" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.861484 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="extract" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.861601 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="extract" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.862152 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.872689 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-rg2gx" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.889955 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb"] Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.915467 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvwhv\" (UniqueName: \"kubernetes.io/projected/b8267dff-2541-481e-bc64-13eb8d19300b-kube-api-access-dvwhv\") pod \"openstack-operator-controller-init-654f4fc7f7-9d4pb\" (UID: \"b8267dff-2541-481e-bc64-13eb8d19300b\") " pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:19:26 crc kubenswrapper[4778]: I0318 09:19:26.017609 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvwhv\" (UniqueName: \"kubernetes.io/projected/b8267dff-2541-481e-bc64-13eb8d19300b-kube-api-access-dvwhv\") pod \"openstack-operator-controller-init-654f4fc7f7-9d4pb\" (UID: \"b8267dff-2541-481e-bc64-13eb8d19300b\") " pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:19:26 crc kubenswrapper[4778]: I0318 09:19:26.044666 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvwhv\" (UniqueName: \"kubernetes.io/projected/b8267dff-2541-481e-bc64-13eb8d19300b-kube-api-access-dvwhv\") pod \"openstack-operator-controller-init-654f4fc7f7-9d4pb\" (UID: \"b8267dff-2541-481e-bc64-13eb8d19300b\") " pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:19:26 crc kubenswrapper[4778]: I0318 09:19:26.183917 4778 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:19:26 crc kubenswrapper[4778]: I0318 09:19:26.617833 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb"] Mar 18 09:19:27 crc kubenswrapper[4778]: I0318 09:19:27.260266 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" event={"ID":"b8267dff-2541-481e-bc64-13eb8d19300b","Type":"ContainerStarted","Data":"a228eaf5ca5631c1c43cfb3184d487c9befddb1cb8e2cf870064bd86f470aabd"} Mar 18 09:19:30 crc kubenswrapper[4778]: I0318 09:19:30.147568 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:19:30 crc kubenswrapper[4778]: I0318 09:19:30.147956 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:19:32 crc kubenswrapper[4778]: I0318 09:19:32.292597 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" event={"ID":"b8267dff-2541-481e-bc64-13eb8d19300b","Type":"ContainerStarted","Data":"2c37a1d2401dad6db96eaba502ecbd7b30e7afca3591df1484a47815bf556097"} Mar 18 09:19:32 crc kubenswrapper[4778]: I0318 09:19:32.293011 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:19:32 crc 
kubenswrapper[4778]: I0318 09:19:32.344551 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" podStartSLOduration=1.952699237 podStartE2EDuration="7.344525008s" podCreationTimestamp="2026-03-18 09:19:25 +0000 UTC" firstStartedPulling="2026-03-18 09:19:26.626712694 +0000 UTC m=+1033.201457534" lastFinishedPulling="2026-03-18 09:19:32.018538465 +0000 UTC m=+1038.593283305" observedRunningTime="2026-03-18 09:19:32.334563426 +0000 UTC m=+1038.909308336" watchObservedRunningTime="2026-03-18 09:19:32.344525008 +0000 UTC m=+1038.919269878" Mar 18 09:19:46 crc kubenswrapper[4778]: I0318 09:19:46.197428 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.146037 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563760-nvkp2"] Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.147912 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.147984 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.148056 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:20:00 crc 
kubenswrapper[4778]: I0318 09:20:00.148136 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.148949 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea61f21841c4e8176f6c189293fbb254c950982318bec3d46909c1b0e41c2f38"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.149051 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://ea61f21841c4e8176f6c189293fbb254c950982318bec3d46909c1b0e41c2f38" gracePeriod=600 Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.154097 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.154577 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.154733 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.159099 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563760-nvkp2"] Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.187912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-766z8\" (UniqueName: \"kubernetes.io/projected/bc3bf93e-1b00-4852-b69b-0c8d701f56e3-kube-api-access-766z8\") pod 
\"auto-csr-approver-29563760-nvkp2\" (UID: \"bc3bf93e-1b00-4852-b69b-0c8d701f56e3\") " pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.289459 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-766z8\" (UniqueName: \"kubernetes.io/projected/bc3bf93e-1b00-4852-b69b-0c8d701f56e3-kube-api-access-766z8\") pod \"auto-csr-approver-29563760-nvkp2\" (UID: \"bc3bf93e-1b00-4852-b69b-0c8d701f56e3\") " pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.318864 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-766z8\" (UniqueName: \"kubernetes.io/projected/bc3bf93e-1b00-4852-b69b-0c8d701f56e3-kube-api-access-766z8\") pod \"auto-csr-approver-29563760-nvkp2\" (UID: \"bc3bf93e-1b00-4852-b69b-0c8d701f56e3\") " pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.468780 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.525987 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="ea61f21841c4e8176f6c189293fbb254c950982318bec3d46909c1b0e41c2f38" exitCode=0 Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.526067 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"ea61f21841c4e8176f6c189293fbb254c950982318bec3d46909c1b0e41c2f38"} Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.526339 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"7154b13c7f4cd2402d0304ed7b86d22cac3e7b544f6222d5ad0d8d8ac0463fe2"} Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.526361 4778 scope.go:117] "RemoveContainer" containerID="2bce801303fcc67febe135eadcc45500477ce5c50a4d9315c95b48aad6bc9b1d" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.965089 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563760-nvkp2"] Mar 18 09:20:01 crc kubenswrapper[4778]: I0318 09:20:01.534544 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" event={"ID":"bc3bf93e-1b00-4852-b69b-0c8d701f56e3","Type":"ContainerStarted","Data":"22f8d1716c9a755ffbb22364f776ec4335946e6e03b2e1fa7f170bfeb4ef8f31"} Mar 18 09:20:02 crc kubenswrapper[4778]: I0318 09:20:02.546253 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" 
event={"ID":"bc3bf93e-1b00-4852-b69b-0c8d701f56e3","Type":"ContainerStarted","Data":"3c3567d850d5fbfcade4077c9139b7f651174e9261a4f7a1ab2f40e22fce3000"} Mar 18 09:20:02 crc kubenswrapper[4778]: I0318 09:20:02.571763 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" podStartSLOduration=1.374263493 podStartE2EDuration="2.571710558s" podCreationTimestamp="2026-03-18 09:20:00 +0000 UTC" firstStartedPulling="2026-03-18 09:20:00.974938754 +0000 UTC m=+1067.549683594" lastFinishedPulling="2026-03-18 09:20:02.172385819 +0000 UTC m=+1068.747130659" observedRunningTime="2026-03-18 09:20:02.569963681 +0000 UTC m=+1069.144708521" watchObservedRunningTime="2026-03-18 09:20:02.571710558 +0000 UTC m=+1069.146455408" Mar 18 09:20:03 crc kubenswrapper[4778]: I0318 09:20:03.555457 4778 generic.go:334] "Generic (PLEG): container finished" podID="bc3bf93e-1b00-4852-b69b-0c8d701f56e3" containerID="3c3567d850d5fbfcade4077c9139b7f651174e9261a4f7a1ab2f40e22fce3000" exitCode=0 Mar 18 09:20:03 crc kubenswrapper[4778]: I0318 09:20:03.555506 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" event={"ID":"bc3bf93e-1b00-4852-b69b-0c8d701f56e3","Type":"ContainerDied","Data":"3c3567d850d5fbfcade4077c9139b7f651174e9261a4f7a1ab2f40e22fce3000"} Mar 18 09:20:04 crc kubenswrapper[4778]: I0318 09:20:04.924127 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.072665 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-766z8\" (UniqueName: \"kubernetes.io/projected/bc3bf93e-1b00-4852-b69b-0c8d701f56e3-kube-api-access-766z8\") pod \"bc3bf93e-1b00-4852-b69b-0c8d701f56e3\" (UID: \"bc3bf93e-1b00-4852-b69b-0c8d701f56e3\") " Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.081518 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc3bf93e-1b00-4852-b69b-0c8d701f56e3-kube-api-access-766z8" (OuterVolumeSpecName: "kube-api-access-766z8") pod "bc3bf93e-1b00-4852-b69b-0c8d701f56e3" (UID: "bc3bf93e-1b00-4852-b69b-0c8d701f56e3"). InnerVolumeSpecName "kube-api-access-766z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.174631 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-766z8\" (UniqueName: \"kubernetes.io/projected/bc3bf93e-1b00-4852-b69b-0c8d701f56e3-kube-api-access-766z8\") on node \"crc\" DevicePath \"\"" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.347174 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt"] Mar 18 09:20:05 crc kubenswrapper[4778]: E0318 09:20:05.347794 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc3bf93e-1b00-4852-b69b-0c8d701f56e3" containerName="oc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.347811 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3bf93e-1b00-4852-b69b-0c8d701f56e3" containerName="oc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.347967 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc3bf93e-1b00-4852-b69b-0c8d701f56e3" containerName="oc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 
09:20:05.348531 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.352026 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5zcs7" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.361038 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.390259 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.391258 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.395830 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-dp9kk" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.408739 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.409513 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.412475 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.416607 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-mm4tx" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.425458 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.434539 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.435665 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.436618 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.437167 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.445655 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8qxnf" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.445858 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-tngp4" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.455982 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.471238 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.472077 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.474836 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-x2sk4" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.479536 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.480547 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.484775 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.484965 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dpc7h" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486271 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq7zm\" (UniqueName: \"kubernetes.io/projected/124dc549-cb2a-4b1c-a610-093cf9b8c05d-kube-api-access-qq7zm\") pod \"horizon-operator-controller-manager-8464cc45fb-x7rnp\" (UID: \"124dc549-cb2a-4b1c-a610-093cf9b8c05d\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486540 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdfx\" (UniqueName: \"kubernetes.io/projected/aceb2f7b-585f-451a-83b8-e673965ada87-kube-api-access-thdfx\") pod \"heat-operator-controller-manager-67dd5f86f5-t5c4w\" (UID: \"aceb2f7b-585f-451a-83b8-e673965ada87\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486578 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vb8t\" (UniqueName: \"kubernetes.io/projected/0526f654-9ddc-4495-bb04-be13e53b6a1b-kube-api-access-6vb8t\") pod \"cinder-operator-controller-manager-8d58dc466-wxftc\" (UID: \"0526f654-9ddc-4495-bb04-be13e53b6a1b\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486609 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj4mr\" (UniqueName: \"kubernetes.io/projected/3390909b-6271-40dd-9662-0710f6866143-kube-api-access-hj4mr\") pod \"barbican-operator-controller-manager-59bc569d95-fsxlt\" (UID: \"3390909b-6271-40dd-9662-0710f6866143\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486629 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486683 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxwjf\" (UniqueName: \"kubernetes.io/projected/710ababb-0bee-441d-8dd0-e6a72ea2b2e3-kube-api-access-kxwjf\") pod \"designate-operator-controller-manager-588d4d986b-7mbx2\" (UID: \"710ababb-0bee-441d-8dd0-e6a72ea2b2e3\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486925 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.498024 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.507061 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.513728 4778 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.514463 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.525579 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.537104 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-6zddf" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.546323 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.547393 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.550881 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-wfl7r" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.566306 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.571334 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" event={"ID":"bc3bf93e-1b00-4852-b69b-0c8d701f56e3","Type":"ContainerDied","Data":"22f8d1716c9a755ffbb22364f776ec4335946e6e03b2e1fa7f170bfeb4ef8f31"} Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.571377 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22f8d1716c9a755ffbb22364f776ec4335946e6e03b2e1fa7f170bfeb4ef8f31" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.571422 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594284 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdfx\" (UniqueName: \"kubernetes.io/projected/aceb2f7b-585f-451a-83b8-e673965ada87-kube-api-access-thdfx\") pod \"heat-operator-controller-manager-67dd5f86f5-t5c4w\" (UID: \"aceb2f7b-585f-451a-83b8-e673965ada87\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594357 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vb8t\" (UniqueName: \"kubernetes.io/projected/0526f654-9ddc-4495-bb04-be13e53b6a1b-kube-api-access-6vb8t\") pod \"cinder-operator-controller-manager-8d58dc466-wxftc\" (UID: \"0526f654-9ddc-4495-bb04-be13e53b6a1b\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594405 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwpvg\" (UniqueName: \"kubernetes.io/projected/b41dbd4a-33dd-4dca-9356-34c740e8063f-kube-api-access-jwpvg\") pod \"glance-operator-controller-manager-79df6bcc97-wb4pc\" (UID: \"b41dbd4a-33dd-4dca-9356-34c740e8063f\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594433 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj4mr\" (UniqueName: \"kubernetes.io/projected/3390909b-6271-40dd-9662-0710f6866143-kube-api-access-hj4mr\") pod \"barbican-operator-controller-manager-59bc569d95-fsxlt\" (UID: \"3390909b-6271-40dd-9662-0710f6866143\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594465 
4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594500 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxwjf\" (UniqueName: \"kubernetes.io/projected/710ababb-0bee-441d-8dd0-e6a72ea2b2e3-kube-api-access-kxwjf\") pod \"designate-operator-controller-manager-588d4d986b-7mbx2\" (UID: \"710ababb-0bee-441d-8dd0-e6a72ea2b2e3\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594593 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lkm5\" (UniqueName: \"kubernetes.io/projected/66d3bf3a-086c-4340-ba73-209f526fc33c-kube-api-access-5lkm5\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594644 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq7zm\" (UniqueName: \"kubernetes.io/projected/124dc549-cb2a-4b1c-a610-093cf9b8c05d-kube-api-access-qq7zm\") pod \"horizon-operator-controller-manager-8464cc45fb-x7rnp\" (UID: \"124dc549-cb2a-4b1c-a610-093cf9b8c05d\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:05 crc kubenswrapper[4778]: E0318 09:20:05.595517 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:05 crc 
kubenswrapper[4778]: E0318 09:20:05.595563 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert podName:66d3bf3a-086c-4340-ba73-209f526fc33c nodeName:}" failed. No retries permitted until 2026-03-18 09:20:06.095547236 +0000 UTC m=+1072.670292076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert") pod "infra-operator-controller-manager-7b9c774f96-64c4x" (UID: "66d3bf3a-086c-4340-ba73-209f526fc33c") : secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.598955 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.603635 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.619247 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-qfjfv" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.619453 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-zpc92"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.621111 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.626353 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdfx\" (UniqueName: \"kubernetes.io/projected/aceb2f7b-585f-451a-83b8-e673965ada87-kube-api-access-thdfx\") pod \"heat-operator-controller-manager-67dd5f86f5-t5c4w\" (UID: \"aceb2f7b-585f-451a-83b8-e673965ada87\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.626538 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2xnxj" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.662077 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vb8t\" (UniqueName: \"kubernetes.io/projected/0526f654-9ddc-4495-bb04-be13e53b6a1b-kube-api-access-6vb8t\") pod \"cinder-operator-controller-manager-8d58dc466-wxftc\" (UID: \"0526f654-9ddc-4495-bb04-be13e53b6a1b\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.662901 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq7zm\" (UniqueName: \"kubernetes.io/projected/124dc549-cb2a-4b1c-a610-093cf9b8c05d-kube-api-access-qq7zm\") pod \"horizon-operator-controller-manager-8464cc45fb-x7rnp\" (UID: \"124dc549-cb2a-4b1c-a610-093cf9b8c05d\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.663647 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.664515 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.666042 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj4mr\" (UniqueName: \"kubernetes.io/projected/3390909b-6271-40dd-9662-0710f6866143-kube-api-access-hj4mr\") pod \"barbican-operator-controller-manager-59bc569d95-fsxlt\" (UID: \"3390909b-6271-40dd-9662-0710f6866143\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.669689 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxwjf\" (UniqueName: \"kubernetes.io/projected/710ababb-0bee-441d-8dd0-e6a72ea2b2e3-kube-api-access-kxwjf\") pod \"designate-operator-controller-manager-588d4d986b-7mbx2\" (UID: \"710ababb-0bee-441d-8dd0-e6a72ea2b2e3\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.674978 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7wkdx" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.691775 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698282 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f65c\" (UniqueName: \"kubernetes.io/projected/211c991a-9406-4360-aa7f-830be3aa55db-kube-api-access-7f65c\") pod \"manila-operator-controller-manager-55f864c847-zpc92\" (UID: \"211c991a-9406-4360-aa7f-830be3aa55db\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698322 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxrp2\" (UniqueName: \"kubernetes.io/projected/37675366-70a8-4e0b-b92b-f7055547d918-kube-api-access-bxrp2\") pod \"mariadb-operator-controller-manager-67ccfc9778-47sbc\" (UID: \"37675366-70a8-4e0b-b92b-f7055547d918\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698351 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lkm5\" (UniqueName: \"kubernetes.io/projected/66d3bf3a-086c-4340-ba73-209f526fc33c-kube-api-access-5lkm5\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698390 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js847\" (UniqueName: \"kubernetes.io/projected/3c86f76c-1617-45e9-9573-f6fd51803b45-kube-api-access-js847\") pod \"ironic-operator-controller-manager-6f787dddc9-fjjvl\" (UID: \"3c86f76c-1617-45e9-9573-f6fd51803b45\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" 
Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl9j2\" (UniqueName: \"kubernetes.io/projected/e1ec7bae-8e15-4844-84d2-ff5951d0be31-kube-api-access-hl9j2\") pod \"keystone-operator-controller-manager-768b96df4c-5xvtc\" (UID: \"e1ec7bae-8e15-4844-84d2-ff5951d0be31\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwpvg\" (UniqueName: \"kubernetes.io/projected/b41dbd4a-33dd-4dca-9356-34c740e8063f-kube-api-access-jwpvg\") pod \"glance-operator-controller-manager-79df6bcc97-wb4pc\" (UID: \"b41dbd4a-33dd-4dca-9356-34c740e8063f\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698485 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brz8f\" (UniqueName: \"kubernetes.io/projected/ae690990-eeb1-4871-8c51-dd3b547e1193-kube-api-access-brz8f\") pod \"neutron-operator-controller-manager-767865f676-k4r2p\" (UID: \"ae690990-eeb1-4871-8c51-dd3b547e1193\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.705002 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.706951 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.734744 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lkm5\" (UniqueName: \"kubernetes.io/projected/66d3bf3a-086c-4340-ba73-209f526fc33c-kube-api-access-5lkm5\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.736005 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-zpc92"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.746533 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwpvg\" (UniqueName: \"kubernetes.io/projected/b41dbd4a-33dd-4dca-9356-34c740e8063f-kube-api-access-jwpvg\") pod \"glance-operator-controller-manager-79df6bcc97-wb4pc\" (UID: \"b41dbd4a-33dd-4dca-9356-34c740e8063f\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.747240 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.767341 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.778912 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.785140 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.799867 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.803737 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js847\" (UniqueName: \"kubernetes.io/projected/3c86f76c-1617-45e9-9573-f6fd51803b45-kube-api-access-js847\") pod \"ironic-operator-controller-manager-6f787dddc9-fjjvl\" (UID: \"3c86f76c-1617-45e9-9573-f6fd51803b45\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.803792 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl9j2\" (UniqueName: \"kubernetes.io/projected/e1ec7bae-8e15-4844-84d2-ff5951d0be31-kube-api-access-hl9j2\") pod \"keystone-operator-controller-manager-768b96df4c-5xvtc\" (UID: \"e1ec7bae-8e15-4844-84d2-ff5951d0be31\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.803840 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brz8f\" (UniqueName: \"kubernetes.io/projected/ae690990-eeb1-4871-8c51-dd3b547e1193-kube-api-access-brz8f\") pod \"neutron-operator-controller-manager-767865f676-k4r2p\" (UID: \"ae690990-eeb1-4871-8c51-dd3b547e1193\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.804043 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f65c\" (UniqueName: \"kubernetes.io/projected/211c991a-9406-4360-aa7f-830be3aa55db-kube-api-access-7f65c\") pod 
\"manila-operator-controller-manager-55f864c847-zpc92\" (UID: \"211c991a-9406-4360-aa7f-830be3aa55db\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.804101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxrp2\" (UniqueName: \"kubernetes.io/projected/37675366-70a8-4e0b-b92b-f7055547d918-kube-api-access-bxrp2\") pod \"mariadb-operator-controller-manager-67ccfc9778-47sbc\" (UID: \"37675366-70a8-4e0b-b92b-f7055547d918\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.818569 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.821684 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.829313 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js847\" (UniqueName: \"kubernetes.io/projected/3c86f76c-1617-45e9-9573-f6fd51803b45-kube-api-access-js847\") pod \"ironic-operator-controller-manager-6f787dddc9-fjjvl\" (UID: \"3c86f76c-1617-45e9-9573-f6fd51803b45\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.829990 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxrp2\" (UniqueName: \"kubernetes.io/projected/37675366-70a8-4e0b-b92b-f7055547d918-kube-api-access-bxrp2\") pod \"mariadb-operator-controller-manager-67ccfc9778-47sbc\" (UID: \"37675366-70a8-4e0b-b92b-f7055547d918\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:05 crc kubenswrapper[4778]: 
I0318 09:20:05.831528 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-jn5mq" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.837564 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.838553 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brz8f\" (UniqueName: \"kubernetes.io/projected/ae690990-eeb1-4871-8c51-dd3b547e1193-kube-api-access-brz8f\") pod \"neutron-operator-controller-manager-767865f676-k4r2p\" (UID: \"ae690990-eeb1-4871-8c51-dd3b547e1193\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.850275 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f65c\" (UniqueName: \"kubernetes.io/projected/211c991a-9406-4360-aa7f-830be3aa55db-kube-api-access-7f65c\") pod \"manila-operator-controller-manager-55f864c847-zpc92\" (UID: \"211c991a-9406-4360-aa7f-830be3aa55db\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.855062 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.858599 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl9j2\" (UniqueName: \"kubernetes.io/projected/e1ec7bae-8e15-4844-84d2-ff5951d0be31-kube-api-access-hl9j2\") pod \"keystone-operator-controller-manager-768b96df4c-5xvtc\" (UID: \"e1ec7bae-8e15-4844-84d2-ff5951d0be31\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.865819 
4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.900029 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.901364 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.908166 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-9j75f" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.921034 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.922245 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.926593 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gd4mk" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.927029 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.953546 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.987889 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563754-8p782"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:05.991988 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:05.995691 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563754-8p782"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.001067 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.007318 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.009370 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.010178 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.011065 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhg5h\" (UniqueName: \"kubernetes.io/projected/c776af1e-ad54-40fe-9bed-a0a09ce0eea7-kube-api-access-mhg5h\") pod \"octavia-operator-controller-manager-5b9f45d989-pzjdt\" (UID: \"c776af1e-ad54-40fe-9bed-a0a09ce0eea7\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.011127 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhg2s\" (UniqueName: \"kubernetes.io/projected/e245908e-e35e-403c-93f6-48371904ae42-kube-api-access-nhg2s\") pod \"nova-operator-controller-manager-5d488d59fb-h6whs\" (UID: \"e245908e-e35e-403c-93f6-48371904ae42\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.013520 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-qvdrw" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.022282 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.030304 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.031251 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.044951 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.047467 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-nz525" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.082295 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.099372 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.100248 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.105944 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-5qm2h" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.117557 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.117621 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmkvs\" (UniqueName: \"kubernetes.io/projected/208b26f2-3c91-4966-9d01-8fe73e4a7d87-kube-api-access-xmkvs\") pod \"ovn-operator-controller-manager-884679f54-fgfk9\" (UID: \"208b26f2-3c91-4966-9d01-8fe73e4a7d87\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.117651 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrdr\" (UniqueName: \"kubernetes.io/projected/80822932-2943-4f81-9436-1553ed031359-kube-api-access-rvrdr\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.117674 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhg5h\" (UniqueName: 
\"kubernetes.io/projected/c776af1e-ad54-40fe-9bed-a0a09ce0eea7-kube-api-access-mhg5h\") pod \"octavia-operator-controller-manager-5b9f45d989-pzjdt\" (UID: \"c776af1e-ad54-40fe-9bed-a0a09ce0eea7\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.117697 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.117719 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhg2s\" (UniqueName: \"kubernetes.io/projected/e245908e-e35e-403c-93f6-48371904ae42-kube-api-access-nhg2s\") pod \"nova-operator-controller-manager-5d488d59fb-h6whs\" (UID: \"e245908e-e35e-403c-93f6-48371904ae42\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.118183 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.118242 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert podName:66d3bf3a-086c-4340-ba73-209f526fc33c nodeName:}" failed. No retries permitted until 2026-03-18 09:20:07.118229648 +0000 UTC m=+1073.692974488 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert") pod "infra-operator-controller-manager-7b9c774f96-64c4x" (UID: "66d3bf3a-086c-4340-ba73-209f526fc33c") : secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.123956 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.153267 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.154090 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.157559 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-9pmgz" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.169901 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhg2s\" (UniqueName: \"kubernetes.io/projected/e245908e-e35e-403c-93f6-48371904ae42-kube-api-access-nhg2s\") pod \"nova-operator-controller-manager-5d488d59fb-h6whs\" (UID: \"e245908e-e35e-403c-93f6-48371904ae42\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.172156 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhg5h\" (UniqueName: \"kubernetes.io/projected/c776af1e-ad54-40fe-9bed-a0a09ce0eea7-kube-api-access-mhg5h\") pod \"octavia-operator-controller-manager-5b9f45d989-pzjdt\" (UID: \"c776af1e-ad54-40fe-9bed-a0a09ce0eea7\") " 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.219413 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="913fd7d5-c271-4918-992c-95e6048faa85" path="/var/lib/kubelet/pods/913fd7d5-c271-4918-992c-95e6048faa85/volumes" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.220050 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.221265 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.221317 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkvs\" (UniqueName: \"kubernetes.io/projected/208b26f2-3c91-4966-9d01-8fe73e4a7d87-kube-api-access-xmkvs\") pod \"ovn-operator-controller-manager-884679f54-fgfk9\" (UID: \"208b26f2-3c91-4966-9d01-8fe73e4a7d87\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.221347 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrdr\" (UniqueName: \"kubernetes.io/projected/80822932-2943-4f81-9436-1553ed031359-kube-api-access-rvrdr\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.221369 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qfp8\" (UniqueName: \"kubernetes.io/projected/8ccabb3b-da59-4ab0-89c8-99094a939f0d-kube-api-access-7qfp8\") pod \"swift-operator-controller-manager-c674c5965-c6l5k\" (UID: \"8ccabb3b-da59-4ab0-89c8-99094a939f0d\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.221407 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hglbk\" (UniqueName: \"kubernetes.io/projected/2f8e8860-00a1-43fc-9776-c617f270cc50-kube-api-access-hglbk\") pod \"placement-operator-controller-manager-5784578c99-d5w9q\" (UID: \"2f8e8860-00a1-43fc-9776-c617f270cc50\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.221523 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.221559 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert podName:80822932-2943-4f81-9436-1553ed031359 nodeName:}" failed. No retries permitted until 2026-03-18 09:20:06.721546574 +0000 UTC m=+1073.296291414 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" (UID: "80822932-2943-4f81-9436-1553ed031359") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.245218 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmkvs\" (UniqueName: \"kubernetes.io/projected/208b26f2-3c91-4966-9d01-8fe73e4a7d87-kube-api-access-xmkvs\") pod \"ovn-operator-controller-manager-884679f54-fgfk9\" (UID: \"208b26f2-3c91-4966-9d01-8fe73e4a7d87\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.278931 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.282264 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.320582 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrdr\" (UniqueName: \"kubernetes.io/projected/80822932-2943-4f81-9436-1553ed031359-kube-api-access-rvrdr\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.371067 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qfp8\" (UniqueName: \"kubernetes.io/projected/8ccabb3b-da59-4ab0-89c8-99094a939f0d-kube-api-access-7qfp8\") pod \"swift-operator-controller-manager-c674c5965-c6l5k\" (UID: 
\"8ccabb3b-da59-4ab0-89c8-99094a939f0d\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.371513 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hglbk\" (UniqueName: \"kubernetes.io/projected/2f8e8860-00a1-43fc-9776-c617f270cc50-kube-api-access-hglbk\") pod \"placement-operator-controller-manager-5784578c99-d5w9q\" (UID: \"2f8e8860-00a1-43fc-9776-c617f270cc50\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.372446 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8vhl\" (UniqueName: \"kubernetes.io/projected/9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77-kube-api-access-n8vhl\") pod \"telemetry-operator-controller-manager-d6b694c5-tx9zq\" (UID: \"9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.385033 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.388969 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-2d5qf" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.392798 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.411837 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hglbk\" (UniqueName: \"kubernetes.io/projected/2f8e8860-00a1-43fc-9776-c617f270cc50-kube-api-access-hglbk\") pod \"placement-operator-controller-manager-5784578c99-d5w9q\" (UID: \"2f8e8860-00a1-43fc-9776-c617f270cc50\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.423064 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qfp8\" (UniqueName: \"kubernetes.io/projected/8ccabb3b-da59-4ab0-89c8-99094a939f0d-kube-api-access-7qfp8\") pod \"swift-operator-controller-manager-c674c5965-c6l5k\" (UID: \"8ccabb3b-da59-4ab0-89c8-99094a939f0d\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.447685 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.449544 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.456638 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.460988 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.473833 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8vhl\" (UniqueName: \"kubernetes.io/projected/9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77-kube-api-access-n8vhl\") pod \"telemetry-operator-controller-manager-d6b694c5-tx9zq\" (UID: \"9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.475387 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.481845 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-nfjkg" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.500584 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8vhl\" (UniqueName: \"kubernetes.io/projected/9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77-kube-api-access-n8vhl\") pod \"telemetry-operator-controller-manager-d6b694c5-tx9zq\" (UID: \"9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.506055 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.507050 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.509391 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-26qtx" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.511285 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.511400 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.523862 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.534345 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.535257 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.538673 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-w2tqv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.564142 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.577652 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9wb2\" (UniqueName: \"kubernetes.io/projected/57277339-c9be-4de1-8e35-72ae98d33905-kube-api-access-v9wb2\") pod \"watcher-operator-controller-manager-6c4d75f7f9-sgs49\" (UID: \"57277339-c9be-4de1-8e35-72ae98d33905\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.578135 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v86w\" (UniqueName: \"kubernetes.io/projected/99adb6be-2a3e-4148-8074-9258222ebd60-kube-api-access-9v86w\") pod \"test-operator-controller-manager-54c5f5bc8-jsm76\" (UID: \"99adb6be-2a3e-4148-8074-9258222ebd60\") " pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.581770 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.583642 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" 
event={"ID":"0526f654-9ddc-4495-bb04-be13e53b6a1b","Type":"ContainerStarted","Data":"eb8d809482f9c01c7a6535ee3ca7de1c4aa288bf8b0d49f0ffaf77bde9f11ffc"} Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.662881 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.680603 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hljh\" (UniqueName: \"kubernetes.io/projected/3c7e3158-5139-467d-b33c-808747f0d9be-kube-api-access-7hljh\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.680647 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8qp9\" (UniqueName: \"kubernetes.io/projected/b837636e-8c09-42b7-9a81-e7875df68344-kube-api-access-g8qp9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5jrv8\" (UID: \"b837636e-8c09-42b7-9a81-e7875df68344\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.680914 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9wb2\" (UniqueName: \"kubernetes.io/projected/57277339-c9be-4de1-8e35-72ae98d33905-kube-api-access-v9wb2\") pod \"watcher-operator-controller-manager-6c4d75f7f9-sgs49\" (UID: \"57277339-c9be-4de1-8e35-72ae98d33905\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.680934 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v86w\" (UniqueName: 
\"kubernetes.io/projected/99adb6be-2a3e-4148-8074-9258222ebd60-kube-api-access-9v86w\") pod \"test-operator-controller-manager-54c5f5bc8-jsm76\" (UID: \"99adb6be-2a3e-4148-8074-9258222ebd60\") " pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.680954 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.680980 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.700623 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v86w\" (UniqueName: \"kubernetes.io/projected/99adb6be-2a3e-4148-8074-9258222ebd60-kube-api-access-9v86w\") pod \"test-operator-controller-manager-54c5f5bc8-jsm76\" (UID: \"99adb6be-2a3e-4148-8074-9258222ebd60\") " pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.702468 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9wb2\" (UniqueName: \"kubernetes.io/projected/57277339-c9be-4de1-8e35-72ae98d33905-kube-api-access-v9wb2\") pod \"watcher-operator-controller-manager-6c4d75f7f9-sgs49\" (UID: 
\"57277339-c9be-4de1-8e35-72ae98d33905\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.711897 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.741077 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.781654 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.782240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.782508 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hljh\" (UniqueName: \"kubernetes.io/projected/3c7e3158-5139-467d-b33c-808747f0d9be-kube-api-access-7hljh\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.782542 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8qp9\" (UniqueName: \"kubernetes.io/projected/b837636e-8c09-42b7-9a81-e7875df68344-kube-api-access-g8qp9\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-5jrv8\" (UID: \"b837636e-8c09-42b7-9a81-e7875df68344\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.782620 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.782782 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.782971 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.783041 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:07.283018328 +0000 UTC m=+1073.857763168 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.783438 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.783512 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:07.28348592 +0000 UTC m=+1073.858230770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "metrics-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.783688 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.783760 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert podName:80822932-2943-4f81-9436-1553ed031359 nodeName:}" failed. No retries permitted until 2026-03-18 09:20:07.783736877 +0000 UTC m=+1074.358481897 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" (UID: "80822932-2943-4f81-9436-1553ed031359") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.811722 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8qp9\" (UniqueName: \"kubernetes.io/projected/b837636e-8c09-42b7-9a81-e7875df68344-kube-api-access-g8qp9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5jrv8\" (UID: \"b837636e-8c09-42b7-9a81-e7875df68344\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.811981 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hljh\" (UniqueName: \"kubernetes.io/projected/3c7e3158-5139-467d-b33c-808747f0d9be-kube-api-access-7hljh\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.824223 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.926578 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.034294 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt"] Mar 18 09:20:07 crc kubenswrapper[4778]: W0318 09:20:07.048336 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3390909b_6271_40dd_9662_0710f6866143.slice/crio-086c9a7df1f7230c0b32e2266231d5ffe8dcd8589f29376e627e50b57e82371b WatchSource:0}: Error finding container 086c9a7df1f7230c0b32e2266231d5ffe8dcd8589f29376e627e50b57e82371b: Status 404 returned error can't find the container with id 086c9a7df1f7230c0b32e2266231d5ffe8dcd8589f29376e627e50b57e82371b Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.137438 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.146768 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w"] Mar 18 09:20:07 crc kubenswrapper[4778]: W0318 09:20:07.150781 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae690990_eeb1_4871_8c51_dd3b547e1193.slice/crio-b58a71c985266ccba8ccd3164eb13041ddf54915bc6689a49db9964daf1899b2 WatchSource:0}: Error finding container b58a71c985266ccba8ccd3164eb13041ddf54915bc6689a49db9964daf1899b2: Status 404 returned error can't find the container with id b58a71c985266ccba8ccd3164eb13041ddf54915bc6689a49db9964daf1899b2 Mar 18 09:20:07 crc kubenswrapper[4778]: W0318 09:20:07.152364 4778 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaceb2f7b_585f_451a_83b8_e673965ada87.slice/crio-f80fdaa618cf362fa439c3bc2438737726c2fbec6e449966c37f37aac4ddbbd5 WatchSource:0}: Error finding container f80fdaa618cf362fa439c3bc2438737726c2fbec6e449966c37f37aac4ddbbd5: Status 404 returned error can't find the container with id f80fdaa618cf362fa439c3bc2438737726c2fbec6e449966c37f37aac4ddbbd5 Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.169614 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.194096 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.194434 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.194792 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert podName:66d3bf3a-086c-4340-ba73-209f526fc33c nodeName:}" failed. No retries permitted until 2026-03-18 09:20:09.194771557 +0000 UTC m=+1075.769516397 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert") pod "infra-operator-controller-manager-7b9c774f96-64c4x" (UID: "66d3bf3a-086c-4340-ba73-209f526fc33c") : secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.295329 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.295387 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.295553 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.295552 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.295610 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:08.295593404 +0000 UTC m=+1074.870338244 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "metrics-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.295638 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:08.295620835 +0000 UTC m=+1074.870365675 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "webhook-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.564477 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.583102 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.606383 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.644221 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.645581 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" 
event={"ID":"3390909b-6271-40dd-9662-0710f6866143","Type":"ContainerStarted","Data":"086c9a7df1f7230c0b32e2266231d5ffe8dcd8589f29376e627e50b57e82371b"} Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.666372 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" event={"ID":"710ababb-0bee-441d-8dd0-e6a72ea2b2e3","Type":"ContainerStarted","Data":"701f288d7559807f8c005c2e53b1c0bc2db223255c618f71e329c0fb88c12c9f"} Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.677452 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" event={"ID":"ae690990-eeb1-4871-8c51-dd3b547e1193","Type":"ContainerStarted","Data":"b58a71c985266ccba8ccd3164eb13041ddf54915bc6689a49db9964daf1899b2"} Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.683599 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-zpc92"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.690743 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.691055 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" event={"ID":"aceb2f7b-585f-451a-83b8-e673965ada87","Type":"ContainerStarted","Data":"f80fdaa618cf362fa439c3bc2438737726c2fbec6e449966c37f37aac4ddbbd5"} Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.692229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" event={"ID":"208b26f2-3c91-4966-9d01-8fe73e4a7d87","Type":"ContainerStarted","Data":"c03f94117048ef2701ab7fc3d7574fab74dac0b36c93a529221c3fad573c76ae"} Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.703226 
4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.816790 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.817075 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.817127 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert podName:80822932-2943-4f81-9436-1553ed031359 nodeName:}" failed. No retries permitted until 2026-03-18 09:20:09.817110705 +0000 UTC m=+1076.391855545 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" (UID: "80822932-2943-4f81-9436-1553ed031359") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.033293 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q"] Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.049558 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8"] Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.056067 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt"] Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.105577 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76"] Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.113553 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.94:5001/openstack-k8s-operators/test-operator:9967a65233f8c87751fea24bb23667f563a71e91,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9v86w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-54c5f5bc8-jsm76_openstack-operators(99adb6be-2a3e-4148-8074-9258222ebd60): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.114942 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" podUID="99adb6be-2a3e-4148-8074-9258222ebd60" Mar 18 09:20:08 crc 
kubenswrapper[4778]: I0318 09:20:08.120129 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49"] Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.127875 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k"] Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.136046 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq"] Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.137115 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7qfp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-c6l5k_openstack-operators(8ccabb3b-da59-4ab0-89c8-99094a939f0d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.138242 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" podUID="8ccabb3b-da59-4ab0-89c8-99094a939f0d" Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.171070 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n8vhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-tx9zq_openstack-operators(9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.172668 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" podUID="9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77" Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.298127 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs"] Mar 18 09:20:08 crc kubenswrapper[4778]: W0318 09:20:08.306899 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode245908e_e35e_403c_93f6_48371904ae42.slice/crio-01d8b950e77b0b6197f6cd86545e3fa36bc1bd06f7bf5183be587ebc228ab213 WatchSource:0}: Error finding container 01d8b950e77b0b6197f6cd86545e3fa36bc1bd06f7bf5183be587ebc228ab213: Status 404 returned error can't find the container with id 
01d8b950e77b0b6197f6cd86545e3fa36bc1bd06f7bf5183be587ebc228ab213 Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.364294 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.364441 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.364463 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.364535 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:10.364514903 +0000 UTC m=+1076.939259743 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "webhook-server-cert" not found Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.364613 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.364696 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:10.364677918 +0000 UTC m=+1076.939422838 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "metrics-server-cert" not found Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.734025 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" event={"ID":"9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77","Type":"ContainerStarted","Data":"795c63e5cf86fe706d352e5445e86b98a2b4c82f8f2ea1e6362fcb144e9e73b1"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.739928 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" event={"ID":"57277339-c9be-4de1-8e35-72ae98d33905","Type":"ContainerStarted","Data":"51c0c3efe70fa092841fcb028a10ddaf784e0b252d86815df04fbbbf37aa2629"} Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.740061 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" podUID="9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77" Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.747474 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" event={"ID":"211c991a-9406-4360-aa7f-830be3aa55db","Type":"ContainerStarted","Data":"01d059d0646cde57bf5523de3e51f528caaa9b9bc794c54a4ca930e45012c328"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.763292 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" event={"ID":"99adb6be-2a3e-4148-8074-9258222ebd60","Type":"ContainerStarted","Data":"fc35a36a6691c170ce0569d7b5df285f4f48fea713f2d75bf7afe3371adac118"} Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.768582 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.94:5001/openstack-k8s-operators/test-operator:9967a65233f8c87751fea24bb23667f563a71e91\\\"\"" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" podUID="99adb6be-2a3e-4148-8074-9258222ebd60" Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.801409 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" event={"ID":"e1ec7bae-8e15-4844-84d2-ff5951d0be31","Type":"ContainerStarted","Data":"fc19af69754f5f4ca9176ab7c8b953e0ef52b3a725ae68203cbb3f2c37e5c075"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.813619 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" event={"ID":"8ccabb3b-da59-4ab0-89c8-99094a939f0d","Type":"ContainerStarted","Data":"1c88c87f30c9adb675e35054d8a0ccb0acf8edfdc742e0e34ca13af1fdc0a714"} Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.817805 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" podUID="8ccabb3b-da59-4ab0-89c8-99094a939f0d" Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.831683 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" event={"ID":"e245908e-e35e-403c-93f6-48371904ae42","Type":"ContainerStarted","Data":"01d8b950e77b0b6197f6cd86545e3fa36bc1bd06f7bf5183be587ebc228ab213"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.863367 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" event={"ID":"37675366-70a8-4e0b-b92b-f7055547d918","Type":"ContainerStarted","Data":"5855588df5d3c9957d6cdf1268c1f0e433a65c63dc5c72a9001b1e42ffdc55aa"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.884326 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" event={"ID":"3c86f76c-1617-45e9-9573-f6fd51803b45","Type":"ContainerStarted","Data":"d0e1e9d9222e9ad294c135a60aa2d7652c89b85238929a2ae3e6aff5506a4e6c"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.887056 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" 
event={"ID":"124dc549-cb2a-4b1c-a610-093cf9b8c05d","Type":"ContainerStarted","Data":"f486ef9a0d7196edfde13693bef43f9bccd606f873a790f26e62a45e9993c955"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.889599 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" event={"ID":"2f8e8860-00a1-43fc-9776-c617f270cc50","Type":"ContainerStarted","Data":"6ec80f52a1552e15b2b0561d37e4314c55f4023073d7e65b7c70b2200590abb6"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.891636 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" event={"ID":"c776af1e-ad54-40fe-9bed-a0a09ce0eea7","Type":"ContainerStarted","Data":"fec3675798b6b01b76c3f155f0804049d1c805f3939460649722db5890f0911d"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.902034 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" event={"ID":"b41dbd4a-33dd-4dca-9356-34c740e8063f","Type":"ContainerStarted","Data":"80c11d26efe778bd105fd3254ee2e04c03127cb3a8e73e18271fa73d7b5e6dac"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.910490 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" event={"ID":"b837636e-8c09-42b7-9a81-e7875df68344","Type":"ContainerStarted","Data":"4bea5c9de09771619869377e6c319249ef6faa2b9bb923b1dd81aeed42b5484a"} Mar 18 09:20:09 crc kubenswrapper[4778]: I0318 09:20:09.280500 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:09 crc 
kubenswrapper[4778]: E0318 09:20:09.280707 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:09 crc kubenswrapper[4778]: E0318 09:20:09.280786 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert podName:66d3bf3a-086c-4340-ba73-209f526fc33c nodeName:}" failed. No retries permitted until 2026-03-18 09:20:13.280767258 +0000 UTC m=+1079.855512098 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert") pod "infra-operator-controller-manager-7b9c774f96-64c4x" (UID: "66d3bf3a-086c-4340-ba73-209f526fc33c") : secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:09 crc kubenswrapper[4778]: I0318 09:20:09.894510 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:09 crc kubenswrapper[4778]: E0318 09:20:09.894877 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:09 crc kubenswrapper[4778]: E0318 09:20:09.895049 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert podName:80822932-2943-4f81-9436-1553ed031359 nodeName:}" failed. No retries permitted until 2026-03-18 09:20:13.894968214 +0000 UTC m=+1080.469713054 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" (UID: "80822932-2943-4f81-9436-1553ed031359") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:09 crc kubenswrapper[4778]: E0318 09:20:09.946325 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" podUID="8ccabb3b-da59-4ab0-89c8-99094a939f0d" Mar 18 09:20:09 crc kubenswrapper[4778]: E0318 09:20:09.946436 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.94:5001/openstack-k8s-operators/test-operator:9967a65233f8c87751fea24bb23667f563a71e91\\\"\"" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" podUID="99adb6be-2a3e-4148-8074-9258222ebd60" Mar 18 09:20:09 crc kubenswrapper[4778]: E0318 09:20:09.946491 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" podUID="9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77" Mar 18 09:20:10 crc kubenswrapper[4778]: I0318 09:20:10.402691 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod 
\"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:10 crc kubenswrapper[4778]: I0318 09:20:10.402750 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:10 crc kubenswrapper[4778]: E0318 09:20:10.402862 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 09:20:10 crc kubenswrapper[4778]: E0318 09:20:10.402914 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:14.402899964 +0000 UTC m=+1080.977644804 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "metrics-server-cert" not found Mar 18 09:20:10 crc kubenswrapper[4778]: E0318 09:20:10.402919 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 09:20:10 crc kubenswrapper[4778]: E0318 09:20:10.402993 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. 
No retries permitted until 2026-03-18 09:20:14.402976366 +0000 UTC m=+1080.977721206 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "webhook-server-cert" not found Mar 18 09:20:13 crc kubenswrapper[4778]: I0318 09:20:13.355178 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:13 crc kubenswrapper[4778]: E0318 09:20:13.355977 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:13 crc kubenswrapper[4778]: E0318 09:20:13.357701 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert podName:66d3bf3a-086c-4340-ba73-209f526fc33c nodeName:}" failed. No retries permitted until 2026-03-18 09:20:21.357666562 +0000 UTC m=+1087.932411402 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert") pod "infra-operator-controller-manager-7b9c774f96-64c4x" (UID: "66d3bf3a-086c-4340-ba73-209f526fc33c") : secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:13 crc kubenswrapper[4778]: I0318 09:20:13.966895 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:13 crc kubenswrapper[4778]: E0318 09:20:13.967011 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:13 crc kubenswrapper[4778]: E0318 09:20:13.967067 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert podName:80822932-2943-4f81-9436-1553ed031359 nodeName:}" failed. No retries permitted until 2026-03-18 09:20:21.967051116 +0000 UTC m=+1088.541795956 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" (UID: "80822932-2943-4f81-9436-1553ed031359") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:14 crc kubenswrapper[4778]: I0318 09:20:14.493405 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:14 crc kubenswrapper[4778]: E0318 09:20:14.493635 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 09:20:14 crc kubenswrapper[4778]: E0318 09:20:14.494185 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:22.494152389 +0000 UTC m=+1089.068897439 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "webhook-server-cert" not found Mar 18 09:20:14 crc kubenswrapper[4778]: E0318 09:20:14.494328 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 09:20:14 crc kubenswrapper[4778]: E0318 09:20:14.494434 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:22.494411166 +0000 UTC m=+1089.069156006 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "metrics-server-cert" not found Mar 18 09:20:14 crc kubenswrapper[4778]: I0318 09:20:14.494070 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:19 crc kubenswrapper[4778]: I0318 09:20:19.990948 4778 scope.go:117] "RemoveContainer" containerID="623e10ff390eb7e19703ecca951ddfb9b57c997906e5eae908a2c7aedc17d0d1" Mar 18 09:20:20 crc kubenswrapper[4778]: E0318 09:20:20.569978 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8" Mar 18 09:20:20 crc kubenswrapper[4778]: E0318 09:20:20.570521 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-js847,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6f787dddc9-fjjvl_openstack-operators(3c86f76c-1617-45e9-9573-f6fd51803b45): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:20:20 crc kubenswrapper[4778]: E0318 09:20:20.571773 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" podUID="3c86f76c-1617-45e9-9573-f6fd51803b45" Mar 18 09:20:21 crc kubenswrapper[4778]: E0318 09:20:21.059054 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" podUID="3c86f76c-1617-45e9-9573-f6fd51803b45" Mar 18 09:20:21 crc kubenswrapper[4778]: E0318 09:20:21.236441 4778 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a" Mar 18 09:20:21 crc kubenswrapper[4778]: E0318 09:20:21.236652 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-brz8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-k4r2p_openstack-operators(ae690990-eeb1-4871-8c51-dd3b547e1193): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:20:21 crc kubenswrapper[4778]: E0318 09:20:21.238691 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" podUID="ae690990-eeb1-4871-8c51-dd3b547e1193" Mar 18 09:20:21 crc kubenswrapper[4778]: I0318 09:20:21.412349 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:21 crc kubenswrapper[4778]: I0318 09:20:21.418087 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:21 crc kubenswrapper[4778]: I0318 09:20:21.710957 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dpc7h" Mar 18 09:20:21 crc kubenswrapper[4778]: I0318 09:20:21.719889 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.021266 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.026143 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:22 crc kubenswrapper[4778]: E0318 09:20:22.065488 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" 
podUID="ae690990-eeb1-4871-8c51-dd3b547e1193" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.117546 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gd4mk" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.125942 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:22 crc kubenswrapper[4778]: E0318 09:20:22.388010 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 18 09:20:22 crc kubenswrapper[4778]: E0318 09:20:22.388278 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nhg2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-h6whs_openstack-operators(e245908e-e35e-403c-93f6-48371904ae42): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:20:22 crc kubenswrapper[4778]: E0318 09:20:22.390514 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" podUID="e245908e-e35e-403c-93f6-48371904ae42" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.529459 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.529813 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.534667 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.537881 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.760138 4778 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-26qtx" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.769400 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:23 crc kubenswrapper[4778]: E0318 09:20:23.074437 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" podUID="e245908e-e35e-403c-93f6-48371904ae42" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.064906 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.065178 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hl9j2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-5xvtc_openstack-operators(e1ec7bae-8e15-4844-84d2-ff5951d0be31): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.066406 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" podUID="e1ec7bae-8e15-4844-84d2-ff5951d0be31" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.079372 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" podUID="e1ec7bae-8e15-4844-84d2-ff5951d0be31" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.468508 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.468722 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: 
{{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g8qp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-5jrv8_openstack-operators(b837636e-8c09-42b7-9a81-e7875df68344): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.469975 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" podUID="b837636e-8c09-42b7-9a81-e7875df68344" Mar 18 09:20:25 crc kubenswrapper[4778]: E0318 09:20:25.087337 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" 
podUID="b837636e-8c09-42b7-9a81-e7875df68344" Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.106265 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" event={"ID":"0526f654-9ddc-4495-bb04-be13e53b6a1b","Type":"ContainerStarted","Data":"10bf04d12f3856f94bc165a7fd6d46930e83ea30fa0da8c1e312fe336a1fe54b"} Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.107488 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.111742 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" event={"ID":"211c991a-9406-4360-aa7f-830be3aa55db","Type":"ContainerStarted","Data":"bf98bbdf61a4bc691bfbbddd3b880e6bd38fb98138fdb140997663f96b0910b3"} Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.112027 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.125284 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" podStartSLOduration=4.25927371 podStartE2EDuration="21.125268154s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:06.564279955 +0000 UTC m=+1073.139024795" lastFinishedPulling="2026-03-18 09:20:23.430274389 +0000 UTC m=+1090.005019239" observedRunningTime="2026-03-18 09:20:26.1243838 +0000 UTC m=+1092.699128640" watchObservedRunningTime="2026-03-18 09:20:26.125268154 +0000 UTC m=+1092.700012994" Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.149367 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" podStartSLOduration=5.374169649 podStartE2EDuration="21.149340603s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.655069604 +0000 UTC m=+1074.229814444" lastFinishedPulling="2026-03-18 09:20:23.430240558 +0000 UTC m=+1090.004985398" observedRunningTime="2026-03-18 09:20:26.142025813 +0000 UTC m=+1092.716770653" watchObservedRunningTime="2026-03-18 09:20:26.149340603 +0000 UTC m=+1092.724085463" Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.198582 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv"] Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.204731 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x"] Mar 18 09:20:26 crc kubenswrapper[4778]: W0318 09:20:26.221251 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66d3bf3a_086c_4340_ba73_209f526fc33c.slice/crio-cb6071b57d3c52aa80cda85c852323bf1b1a367f19786fc37b5f27315ce3b67c WatchSource:0}: Error finding container cb6071b57d3c52aa80cda85c852323bf1b1a367f19786fc37b5f27315ce3b67c: Status 404 returned error can't find the container with id cb6071b57d3c52aa80cda85c852323bf1b1a367f19786fc37b5f27315ce3b67c Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.315389 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr"] Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.131056 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" 
event={"ID":"208b26f2-3c91-4966-9d01-8fe73e4a7d87","Type":"ContainerStarted","Data":"7a45d06efb24496c3b891685b78f23ada48de52d50c828ce049b9e7165130516"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.132726 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.140007 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" event={"ID":"66d3bf3a-086c-4340-ba73-209f526fc33c","Type":"ContainerStarted","Data":"cb6071b57d3c52aa80cda85c852323bf1b1a367f19786fc37b5f27315ce3b67c"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.156076 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" event={"ID":"8ccabb3b-da59-4ab0-89c8-99094a939f0d","Type":"ContainerStarted","Data":"678ceaef4518db718dfe9de9d0268dda8a53fdd582caeb325e04a0a58249c4fb"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.157362 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.171290 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" event={"ID":"99adb6be-2a3e-4148-8074-9258222ebd60","Type":"ContainerStarted","Data":"42833b94d7023c6cf9fde2bddf9b46176c05e8b6acb5cf31fc6ccbde134e302d"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.171804 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" podStartSLOduration=5.317619714 podStartE2EDuration="22.171770772s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.601369946 +0000 UTC 
m=+1074.176114776" lastFinishedPulling="2026-03-18 09:20:24.455520994 +0000 UTC m=+1091.030265834" observedRunningTime="2026-03-18 09:20:27.160649608 +0000 UTC m=+1093.735394448" watchObservedRunningTime="2026-03-18 09:20:27.171770772 +0000 UTC m=+1093.746515622" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.171924 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.191477 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" event={"ID":"2f8e8860-00a1-43fc-9776-c617f270cc50","Type":"ContainerStarted","Data":"f79ab034e094322537d45810b2420051cb2dde668bd2d4935e9dc51eb4f1d22f"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.192002 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.208861 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" podStartSLOduration=4.645056499 podStartE2EDuration="22.208840865s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.137003152 +0000 UTC m=+1074.711747992" lastFinishedPulling="2026-03-18 09:20:25.700787488 +0000 UTC m=+1092.275532358" observedRunningTime="2026-03-18 09:20:27.195314625 +0000 UTC m=+1093.770059495" watchObservedRunningTime="2026-03-18 09:20:27.208840865 +0000 UTC m=+1093.783585705" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.231997 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" podStartSLOduration=5.839454491 podStartE2EDuration="22.231970307s" 
podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.078556354 +0000 UTC m=+1074.653301184" lastFinishedPulling="2026-03-18 09:20:24.47107213 +0000 UTC m=+1091.045817000" observedRunningTime="2026-03-18 09:20:27.229566132 +0000 UTC m=+1093.804310982" watchObservedRunningTime="2026-03-18 09:20:27.231970307 +0000 UTC m=+1093.806715157" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.234253 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" event={"ID":"c776af1e-ad54-40fe-9bed-a0a09ce0eea7","Type":"ContainerStarted","Data":"c03f2e759ed3cbf30fc2fa8a252e5497da1d91b431f59aa059d1958afe02ef1a"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.234922 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.246921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" event={"ID":"124dc549-cb2a-4b1c-a610-093cf9b8c05d","Type":"ContainerStarted","Data":"7c873f952c4bff30d1419439579f8ff651e5b4f9ff778ec6e1ae794557fe1526"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.247876 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.252232 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" event={"ID":"37675366-70a8-4e0b-b92b-f7055547d918","Type":"ContainerStarted","Data":"27fa6dfa89e1761c5b611170d6c156f6819bf15f781bd8c4a987cb3a232b83b0"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.252949 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.265279 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" event={"ID":"57277339-c9be-4de1-8e35-72ae98d33905","Type":"ContainerStarted","Data":"1a7359941866285e9344a5e7853a9371e2f69381f723ff5130bbfff9e657184d"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.265317 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.270933 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" podStartSLOduration=4.417172707 podStartE2EDuration="22.270916512s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.113395636 +0000 UTC m=+1074.688140476" lastFinishedPulling="2026-03-18 09:20:25.967139441 +0000 UTC m=+1092.541884281" observedRunningTime="2026-03-18 09:20:27.27007373 +0000 UTC m=+1093.844818580" watchObservedRunningTime="2026-03-18 09:20:27.270916512 +0000 UTC m=+1093.845661352" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.273634 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" event={"ID":"3c7e3158-5139-467d-b33c-808747f0d9be","Type":"ContainerStarted","Data":"9b2f692199c8fce2bd61b2d59ec71db9b2cf3dda9463e173099bb31fd1f733e1"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.273670 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" event={"ID":"3c7e3158-5139-467d-b33c-808747f0d9be","Type":"ContainerStarted","Data":"055f1a64f06bc32c86f16846259baa73a25844ecbe122df218cea6767810641f"} Mar 
18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.275403 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.285345 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" event={"ID":"710ababb-0bee-441d-8dd0-e6a72ea2b2e3","Type":"ContainerStarted","Data":"66b7dc36df17775e00bc5d8587cda6ec4a0e1a6948129872c69dcc4efdf2c3d5"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.285731 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.287150 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" event={"ID":"80822932-2943-4f81-9436-1553ed031359","Type":"ContainerStarted","Data":"5fb40edc8c76a7399e800953259f175cf2429abe682dff519cc6ca4906398e7b"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.302520 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" event={"ID":"b41dbd4a-33dd-4dca-9356-34c740e8063f","Type":"ContainerStarted","Data":"ec124d7e4c28ea2b93a76b9518ca4879cc506d3a3dd8e671b0b2b0dda809e9bf"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.302753 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.315035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" 
event={"ID":"3390909b-6271-40dd-9662-0710f6866143","Type":"ContainerStarted","Data":"4933e2c29e48f2445c253a18602ca70a7c13376e97dd6eb9fa8b1c5059569f9e"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.315674 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.323136 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" podStartSLOduration=5.955624559 podStartE2EDuration="22.32310058s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.089694409 +0000 UTC m=+1074.664439259" lastFinishedPulling="2026-03-18 09:20:24.45717041 +0000 UTC m=+1091.031915280" observedRunningTime="2026-03-18 09:20:27.307910574 +0000 UTC m=+1093.882655414" watchObservedRunningTime="2026-03-18 09:20:27.32310058 +0000 UTC m=+1093.897845430" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.337184 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" event={"ID":"aceb2f7b-585f-451a-83b8-e673965ada87","Type":"ContainerStarted","Data":"9fd491aa6d7d1a94922fc9bacf35404b80a5b1f87c84d423d42b5e68bfe9d905"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.337241 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.355064 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" podStartSLOduration=5.574947249 podStartE2EDuration="22.355047973s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.675646256 +0000 UTC m=+1074.250391096" 
lastFinishedPulling="2026-03-18 09:20:24.45574695 +0000 UTC m=+1091.030491820" observedRunningTime="2026-03-18 09:20:27.350656453 +0000 UTC m=+1093.925401293" watchObservedRunningTime="2026-03-18 09:20:27.355047973 +0000 UTC m=+1093.929792813" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.386460 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" podStartSLOduration=5.553530612 podStartE2EDuration="22.386445471s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.622334448 +0000 UTC m=+1074.197079288" lastFinishedPulling="2026-03-18 09:20:24.455249307 +0000 UTC m=+1091.029994147" observedRunningTime="2026-03-18 09:20:27.381877757 +0000 UTC m=+1093.956622597" watchObservedRunningTime="2026-03-18 09:20:27.386445471 +0000 UTC m=+1093.961190311" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.417110 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" podStartSLOduration=6.011904237 podStartE2EDuration="22.417091039s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.101719397 +0000 UTC m=+1074.676464237" lastFinishedPulling="2026-03-18 09:20:24.506906209 +0000 UTC m=+1091.081651039" observedRunningTime="2026-03-18 09:20:27.413232224 +0000 UTC m=+1093.987977074" watchObservedRunningTime="2026-03-18 09:20:27.417091039 +0000 UTC m=+1093.991835879" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.441557 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" podStartSLOduration=5.146831552 podStartE2EDuration="22.441525668s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.160663324 +0000 UTC m=+1073.735408164" 
lastFinishedPulling="2026-03-18 09:20:24.45535743 +0000 UTC m=+1091.030102280" observedRunningTime="2026-03-18 09:20:27.438405232 +0000 UTC m=+1094.013150102" watchObservedRunningTime="2026-03-18 09:20:27.441525668 +0000 UTC m=+1094.016270508" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.471567 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" podStartSLOduration=6.106378612 podStartE2EDuration="22.471547039s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.67576792 +0000 UTC m=+1074.250512760" lastFinishedPulling="2026-03-18 09:20:24.040936347 +0000 UTC m=+1090.615681187" observedRunningTime="2026-03-18 09:20:27.467525078 +0000 UTC m=+1094.042269928" watchObservedRunningTime="2026-03-18 09:20:27.471547039 +0000 UTC m=+1094.046291879" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.512132 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" podStartSLOduration=5.258149227 podStartE2EDuration="22.512111108s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.202671843 +0000 UTC m=+1073.777416683" lastFinishedPulling="2026-03-18 09:20:24.456633724 +0000 UTC m=+1091.031378564" observedRunningTime="2026-03-18 09:20:27.499335149 +0000 UTC m=+1094.074079999" watchObservedRunningTime="2026-03-18 09:20:27.512111108 +0000 UTC m=+1094.086855948" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.535513 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" podStartSLOduration=5.143072519 podStartE2EDuration="22.535492097s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.063258361 +0000 UTC m=+1073.638003201" 
lastFinishedPulling="2026-03-18 09:20:24.455677939 +0000 UTC m=+1091.030422779" observedRunningTime="2026-03-18 09:20:27.532457144 +0000 UTC m=+1094.107201994" watchObservedRunningTime="2026-03-18 09:20:27.535492097 +0000 UTC m=+1094.110236937" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.578002 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" podStartSLOduration=21.577986969 podStartE2EDuration="21.577986969s" podCreationTimestamp="2026-03-18 09:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:20:27.572960462 +0000 UTC m=+1094.147705312" watchObservedRunningTime="2026-03-18 09:20:27.577986969 +0000 UTC m=+1094.152731809" Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.374719 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" event={"ID":"9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77","Type":"ContainerStarted","Data":"dce6359d66344d37b58bb6fff39a43da4941a5d8668915d38bbb48dcbe686103"} Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.375558 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.376698 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" event={"ID":"66d3bf3a-086c-4340-ba73-209f526fc33c","Type":"ContainerStarted","Data":"3242e3b292a8b48b95f430fe27b293a4ff6d9ed3fa810741531fceee5082a9b4"} Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.376850 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:31 crc 
kubenswrapper[4778]: I0318 09:20:31.378880 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" event={"ID":"80822932-2943-4f81-9436-1553ed031359","Type":"ContainerStarted","Data":"6f8410dee7c497f7a0937afe00f1c0314e7925466d7664ef8e7c6c42c9a1005f"} Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.379135 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.397990 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" podStartSLOduration=3.674537561 podStartE2EDuration="26.397947786s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.170654873 +0000 UTC m=+1074.745399713" lastFinishedPulling="2026-03-18 09:20:30.894065098 +0000 UTC m=+1097.468809938" observedRunningTime="2026-03-18 09:20:31.395680544 +0000 UTC m=+1097.970425424" watchObservedRunningTime="2026-03-18 09:20:31.397947786 +0000 UTC m=+1097.972692666" Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.424229 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" podStartSLOduration=21.731163253 podStartE2EDuration="26.424178034s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:26.227620564 +0000 UTC m=+1092.802365404" lastFinishedPulling="2026-03-18 09:20:30.920635315 +0000 UTC m=+1097.495380185" observedRunningTime="2026-03-18 09:20:31.422055626 +0000 UTC m=+1097.996800496" watchObservedRunningTime="2026-03-18 09:20:31.424178034 +0000 UTC m=+1097.998922894" Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.460553 4778 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" podStartSLOduration=21.769118062 podStartE2EDuration="26.460524068s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:26.230439231 +0000 UTC m=+1092.805184071" lastFinishedPulling="2026-03-18 09:20:30.921845217 +0000 UTC m=+1097.496590077" observedRunningTime="2026-03-18 09:20:31.449709022 +0000 UTC m=+1098.024453872" watchObservedRunningTime="2026-03-18 09:20:31.460524068 +0000 UTC m=+1098.035268918" Mar 18 09:20:32 crc kubenswrapper[4778]: I0318 09:20:32.777257 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.695723 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.716103 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.750732 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.779753 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.784884 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.803904 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.996620 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.010404 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.282056 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.485761 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.667000 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.716529 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.747369 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.787046 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.827935 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:41 crc kubenswrapper[4778]: I0318 09:20:41.730165 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:42 crc kubenswrapper[4778]: I0318 09:20:42.136814 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:43 crc kubenswrapper[4778]: I0318 09:20:43.483988 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" event={"ID":"e245908e-e35e-403c-93f6-48371904ae42","Type":"ContainerStarted","Data":"288761a7e4ffe40fe1649b5ee737a2e3be3b06ba9708ec90175fbe22bdcb1bb7"} Mar 18 09:20:43 crc kubenswrapper[4778]: I0318 09:20:43.484786 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:20:43 crc kubenswrapper[4778]: I0318 09:20:43.521018 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" podStartSLOduration=4.403711626 podStartE2EDuration="38.520976609s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.309668434 +0000 UTC m=+1074.884413274" lastFinishedPulling="2026-03-18 09:20:42.426933387 +0000 UTC m=+1109.001678257" observedRunningTime="2026-03-18 09:20:43.518886832 +0000 UTC m=+1110.093631692" watchObservedRunningTime="2026-03-18 09:20:43.520976609 +0000 UTC m=+1110.095721489" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.514716 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" 
event={"ID":"b837636e-8c09-42b7-9a81-e7875df68344","Type":"ContainerStarted","Data":"c7761531b0126c6fab9efbb77ef73a19be290edf5217e7c0567d44c31a1ec345"} Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.518572 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" event={"ID":"3c86f76c-1617-45e9-9573-f6fd51803b45","Type":"ContainerStarted","Data":"3588c7cb07a2b0d12d1d41bfb28021419fd61be8c63ed51a50e5827b5369e309"} Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.519622 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.521805 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" event={"ID":"ae690990-eeb1-4871-8c51-dd3b547e1193","Type":"ContainerStarted","Data":"884302ecd2a0d3fb98a1f175a3d1326060ba1dd8333398b9adbdd0e566bd3d2d"} Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.522013 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.524702 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" event={"ID":"e1ec7bae-8e15-4844-84d2-ff5951d0be31","Type":"ContainerStarted","Data":"06333c67b257e873769a57d0d5c9681a935d00536f9890a79c7f46156b018f50"} Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.524997 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.548824 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" podStartSLOduration=2.736765689 podStartE2EDuration="39.548794492s" podCreationTimestamp="2026-03-18 09:20:06 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.078574385 +0000 UTC m=+1074.653319235" lastFinishedPulling="2026-03-18 09:20:44.890603198 +0000 UTC m=+1111.465348038" observedRunningTime="2026-03-18 09:20:45.534822231 +0000 UTC m=+1112.109567151" watchObservedRunningTime="2026-03-18 09:20:45.548794492 +0000 UTC m=+1112.123539362" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.561908 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" podStartSLOduration=4.189084211 podStartE2EDuration="40.561878978s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.010809312 +0000 UTC m=+1074.585554152" lastFinishedPulling="2026-03-18 09:20:44.383604079 +0000 UTC m=+1110.958348919" observedRunningTime="2026-03-18 09:20:45.558385083 +0000 UTC m=+1112.133129993" watchObservedRunningTime="2026-03-18 09:20:45.561878978 +0000 UTC m=+1112.136623818" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.578389 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" podStartSLOduration=3.3602219939999998 podStartE2EDuration="40.578356818s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.163564783 +0000 UTC m=+1073.738309623" lastFinishedPulling="2026-03-18 09:20:44.381699597 +0000 UTC m=+1110.956444447" observedRunningTime="2026-03-18 09:20:45.575095709 +0000 UTC m=+1112.149840559" watchObservedRunningTime="2026-03-18 09:20:45.578356818 +0000 UTC m=+1112.153101698" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.599147 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" podStartSLOduration=3.851720334 podStartE2EDuration="40.599116745s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.637001809 +0000 UTC m=+1074.211746649" lastFinishedPulling="2026-03-18 09:20:44.38439819 +0000 UTC m=+1110.959143060" observedRunningTime="2026-03-18 09:20:45.58982054 +0000 UTC m=+1112.164565440" watchObservedRunningTime="2026-03-18 09:20:45.599116745 +0000 UTC m=+1112.173861625" Mar 18 09:20:55 crc kubenswrapper[4778]: I0318 09:20:55.843329 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" Mar 18 09:20:55 crc kubenswrapper[4778]: I0318 09:20:55.872647 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:56 crc kubenswrapper[4778]: I0318 09:20:56.026046 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:56 crc kubenswrapper[4778]: I0318 09:20:56.460820 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.366060 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csxwm"] Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.373513 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.375802 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7szsm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.376240 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.376654 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.376902 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.393337 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csxwm"] Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.460728 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbpnt"] Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.461830 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.467537 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.502587 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbpnt"] Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.517803 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.518127 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-config\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.518371 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n4kw\" (UniqueName: \"kubernetes.io/projected/4c0c899c-e724-4486-bfba-42c7f089cfa7-kube-api-access-5n4kw\") pod \"dnsmasq-dns-675f4bcbfc-csxwm\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.518540 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0c899c-e724-4486-bfba-42c7f089cfa7-config\") pod \"dnsmasq-dns-675f4bcbfc-csxwm\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 
crc kubenswrapper[4778]: I0318 09:21:15.518695 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvcvh\" (UniqueName: \"kubernetes.io/projected/ef690ca0-3568-4334-bddc-956b11424d40-kube-api-access-fvcvh\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.620622 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.620956 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-config\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.621100 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n4kw\" (UniqueName: \"kubernetes.io/projected/4c0c899c-e724-4486-bfba-42c7f089cfa7-kube-api-access-5n4kw\") pod \"dnsmasq-dns-675f4bcbfc-csxwm\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.621374 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0c899c-e724-4486-bfba-42c7f089cfa7-config\") pod \"dnsmasq-dns-675f4bcbfc-csxwm\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.622383 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-config\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.622365 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0c899c-e724-4486-bfba-42c7f089cfa7-config\") pod \"dnsmasq-dns-675f4bcbfc-csxwm\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.621941 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.622640 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvcvh\" (UniqueName: \"kubernetes.io/projected/ef690ca0-3568-4334-bddc-956b11424d40-kube-api-access-fvcvh\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.657457 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n4kw\" (UniqueName: \"kubernetes.io/projected/4c0c899c-e724-4486-bfba-42c7f089cfa7-kube-api-access-5n4kw\") pod \"dnsmasq-dns-675f4bcbfc-csxwm\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.658968 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvcvh\" (UniqueName: 
\"kubernetes.io/projected/ef690ca0-3568-4334-bddc-956b11424d40-kube-api-access-fvcvh\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.693318 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.801219 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:16 crc kubenswrapper[4778]: I0318 09:21:16.023272 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbpnt"] Mar 18 09:21:16 crc kubenswrapper[4778]: W0318 09:21:16.028590 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef690ca0_3568_4334_bddc_956b11424d40.slice/crio-56d81e5e06ecb4723728e490b7892ca9d5f51a511bc3931f818bfe5aaf2487d9 WatchSource:0}: Error finding container 56d81e5e06ecb4723728e490b7892ca9d5f51a511bc3931f818bfe5aaf2487d9: Status 404 returned error can't find the container with id 56d81e5e06ecb4723728e490b7892ca9d5f51a511bc3931f818bfe5aaf2487d9 Mar 18 09:21:16 crc kubenswrapper[4778]: I0318 09:21:16.131419 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csxwm"] Mar 18 09:21:16 crc kubenswrapper[4778]: W0318 09:21:16.136161 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c0c899c_e724_4486_bfba_42c7f089cfa7.slice/crio-4c2bd12d0d178885863f2b67ec45e234d90e570362b4cc7fec292b577d9c4441 WatchSource:0}: Error finding container 4c2bd12d0d178885863f2b67ec45e234d90e570362b4cc7fec292b577d9c4441: Status 404 returned error can't find the container with id 4c2bd12d0d178885863f2b67ec45e234d90e570362b4cc7fec292b577d9c4441 Mar 
18 09:21:16 crc kubenswrapper[4778]: I0318 09:21:16.808265 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" event={"ID":"ef690ca0-3568-4334-bddc-956b11424d40","Type":"ContainerStarted","Data":"56d81e5e06ecb4723728e490b7892ca9d5f51a511bc3931f818bfe5aaf2487d9"} Mar 18 09:21:16 crc kubenswrapper[4778]: I0318 09:21:16.810857 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" event={"ID":"4c0c899c-e724-4486-bfba-42c7f089cfa7","Type":"ContainerStarted","Data":"4c2bd12d0d178885863f2b67ec45e234d90e570362b4cc7fec292b577d9c4441"} Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.211997 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csxwm"] Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.244446 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ffznk"] Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.245950 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.255333 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ffznk"] Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.375112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-config\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.375175 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.375248 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7prc\" (UniqueName: \"kubernetes.io/projected/b849baae-7043-48dd-be08-0edde88c7c69-kube-api-access-v7prc\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.477283 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7prc\" (UniqueName: \"kubernetes.io/projected/b849baae-7043-48dd-be08-0edde88c7c69-kube-api-access-v7prc\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.477412 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-config\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.477438 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.478432 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.478742 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-config\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.519018 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7prc\" (UniqueName: \"kubernetes.io/projected/b849baae-7043-48dd-be08-0edde88c7c69-kube-api-access-v7prc\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.539098 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbpnt"] Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.560737 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-xvr5c"] Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.566284 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.605221 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.631744 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xvr5c"] Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.709531 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.709582 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-config\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.709854 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chwdv\" (UniqueName: \"kubernetes.io/projected/124c0069-debd-459c-9d66-f38d9d096996-kube-api-access-chwdv\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.811520 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chwdv\" (UniqueName: 
\"kubernetes.io/projected/124c0069-debd-459c-9d66-f38d9d096996-kube-api-access-chwdv\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.811584 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.811612 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-config\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.812795 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-config\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.813535 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.838391 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chwdv\" (UniqueName: \"kubernetes.io/projected/124c0069-debd-459c-9d66-f38d9d096996-kube-api-access-chwdv\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" 
(UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.999389 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.170670 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ffznk"] Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.419019 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.420393 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426066 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426085 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426219 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426416 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426490 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426599 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426732 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7f9jg" Mar 
18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.436147 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.521977 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.522626 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.522692 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.522970 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523131 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/671fe1be-f3dd-475e-8c48-a1d1db510aef-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523175 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523219 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/671fe1be-f3dd-475e-8c48-a1d1db510aef-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523280 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhk6k\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-kube-api-access-mhk6k\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523426 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523506 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523546 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.625880 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.625966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626011 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/671fe1be-f3dd-475e-8c48-a1d1db510aef-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626032 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626047 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/671fe1be-f3dd-475e-8c48-a1d1db510aef-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626064 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhk6k\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-kube-api-access-mhk6k\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626089 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626105 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626127 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626154 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626173 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.627052 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.627343 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.627904 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 
09:21:19.628146 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.629098 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.629590 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.635796 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/671fe1be-f3dd-475e-8c48-a1d1db510aef-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.637545 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.640763 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/671fe1be-f3dd-475e-8c48-a1d1db510aef-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.656043 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.662925 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhk6k\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-kube-api-access-mhk6k\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.668418 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.737164 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.739508 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.743564 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.743616 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.743564 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.743786 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.744474 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.744587 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.744698 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-b2npt" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.755469 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.759560 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830254 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830322 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trc2k\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-kube-api-access-trc2k\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830352 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830394 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-config-data\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830424 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " 
pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830714 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830746 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830784 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57955df9-f0c5-4cfc-91fd-135771be7ed2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830803 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: 
I0318 09:21:19.830898 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57955df9-f0c5-4cfc-91fd-135771be7ed2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.932962 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933046 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933081 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933105 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57955df9-f0c5-4cfc-91fd-135771be7ed2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933248 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933332 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57955df9-f0c5-4cfc-91fd-135771be7ed2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933391 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933427 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trc2k\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-kube-api-access-trc2k\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933455 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933485 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-config-data\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " 
pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933526 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.934001 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.934733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.935320 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.935649 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.935798 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-config-data\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.936887 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.939233 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57955df9-f0c5-4cfc-91fd-135771be7ed2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.940730 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.941666 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57955df9-f0c5-4cfc-91fd-135771be7ed2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.950932 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " 
pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.952689 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trc2k\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-kube-api-access-trc2k\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.964276 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.077492 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.731383 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.736111 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.739182 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.739949 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.740093 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-btksr" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.740814 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.753386 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.769016 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.855969 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856022 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856062 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856100 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-config-data-default\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856133 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856166 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-kolla-config\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856311 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsgm9\" (UniqueName: \"kubernetes.io/projected/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-kube-api-access-lsgm9\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856375 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.957901 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-kolla-config\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsgm9\" (UniqueName: \"kubernetes.io/projected/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-kube-api-access-lsgm9\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958062 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958090 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958118 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958151 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958188 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-config-data-default\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958409 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958619 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958771 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-kolla-config\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.959621 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-config-data-default\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.960067 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.979952 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.980655 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.984297 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsgm9\" (UniqueName: 
\"kubernetes.io/projected/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-kube-api-access-lsgm9\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.999764 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.080145 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.985269 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.990702 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.994096 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-trpxt" Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.994154 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.995434 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.996451 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.997149 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.074720 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvg6l\" (UniqueName: \"kubernetes.io/projected/49ce9560-3ee2-48d2-b016-a9feefb3a798-kube-api-access-jvg6l\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.074798 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.074849 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.074915 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/49ce9560-3ee2-48d2-b016-a9feefb3a798-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.074957 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.075006 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.075043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ce9560-3ee2-48d2-b016-a9feefb3a798-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.075159 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ce9560-3ee2-48d2-b016-a9feefb3a798-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.176821 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.176866 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.176891 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ce9560-3ee2-48d2-b016-a9feefb3a798-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.176930 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ce9560-3ee2-48d2-b016-a9feefb3a798-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.177012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvg6l\" (UniqueName: \"kubernetes.io/projected/49ce9560-3ee2-48d2-b016-a9feefb3a798-kube-api-access-jvg6l\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.177041 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.177062 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.177113 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/49ce9560-3ee2-48d2-b016-a9feefb3a798-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.177700 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.177992 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/49ce9560-3ee2-48d2-b016-a9feefb3a798-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.178519 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.184920 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.193255 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.206162 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ce9560-3ee2-48d2-b016-a9feefb3a798-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.225923 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ce9560-3ee2-48d2-b016-a9feefb3a798-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.237963 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvg6l\" (UniqueName: \"kubernetes.io/projected/49ce9560-3ee2-48d2-b016-a9feefb3a798-kube-api-access-jvg6l\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.253451 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.351306 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.402877 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.405524 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.407774 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-6dj2t" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.414455 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.414711 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.415587 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.482377 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc50d224-cd65-4a46-b3d0-b40acdbda53d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.482432 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc50d224-cd65-4a46-b3d0-b40acdbda53d-config-data\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.482484 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc50d224-cd65-4a46-b3d0-b40acdbda53d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.482577 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrhhv\" (UniqueName: \"kubernetes.io/projected/fc50d224-cd65-4a46-b3d0-b40acdbda53d-kube-api-access-hrhhv\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.482854 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc50d224-cd65-4a46-b3d0-b40acdbda53d-kolla-config\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.584780 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc50d224-cd65-4a46-b3d0-b40acdbda53d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.584826 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc50d224-cd65-4a46-b3d0-b40acdbda53d-config-data\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.584855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc50d224-cd65-4a46-b3d0-b40acdbda53d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " 
pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.584909 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrhhv\" (UniqueName: \"kubernetes.io/projected/fc50d224-cd65-4a46-b3d0-b40acdbda53d-kube-api-access-hrhhv\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.584948 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc50d224-cd65-4a46-b3d0-b40acdbda53d-kolla-config\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.585927 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc50d224-cd65-4a46-b3d0-b40acdbda53d-config-data\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.586301 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc50d224-cd65-4a46-b3d0-b40acdbda53d-kolla-config\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.592716 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc50d224-cd65-4a46-b3d0-b40acdbda53d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.592873 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc50d224-cd65-4a46-b3d0-b40acdbda53d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.610785 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrhhv\" (UniqueName: \"kubernetes.io/projected/fc50d224-cd65-4a46-b3d0-b40acdbda53d-kube-api-access-hrhhv\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.725331 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 09:21:23 crc kubenswrapper[4778]: I0318 09:21:23.336944 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xvr5c"] Mar 18 09:21:23 crc kubenswrapper[4778]: I0318 09:21:23.890000 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" event={"ID":"b849baae-7043-48dd-be08-0edde88c7c69","Type":"ContainerStarted","Data":"32885a5d9fd1876608960a09778b3825b17dc07db2a1dad6becf112b8746266b"} Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.524687 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.525858 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.528150 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-vmscp" Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.548564 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.633125 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xklgw\" (UniqueName: \"kubernetes.io/projected/45babbce-b5d2-4ad5-8bc2-a5047e777e8d-kube-api-access-xklgw\") pod \"kube-state-metrics-0\" (UID: \"45babbce-b5d2-4ad5-8bc2-a5047e777e8d\") " pod="openstack/kube-state-metrics-0" Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.735415 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xklgw\" (UniqueName: \"kubernetes.io/projected/45babbce-b5d2-4ad5-8bc2-a5047e777e8d-kube-api-access-xklgw\") pod \"kube-state-metrics-0\" (UID: \"45babbce-b5d2-4ad5-8bc2-a5047e777e8d\") " pod="openstack/kube-state-metrics-0" Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.764303 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xklgw\" (UniqueName: \"kubernetes.io/projected/45babbce-b5d2-4ad5-8bc2-a5047e777e8d-kube-api-access-xklgw\") pod \"kube-state-metrics-0\" (UID: \"45babbce-b5d2-4ad5-8bc2-a5047e777e8d\") " pod="openstack/kube-state-metrics-0" Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.846095 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 09:21:26 crc kubenswrapper[4778]: W0318 09:21:26.123663 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod124c0069_debd_459c_9d66_f38d9d096996.slice/crio-81379cb4a6ab1c31c5025b2cadc823fb3d4cb38e6f779d5cd8007da3c790b69a WatchSource:0}: Error finding container 81379cb4a6ab1c31c5025b2cadc823fb3d4cb38e6f779d5cd8007da3c790b69a: Status 404 returned error can't find the container with id 81379cb4a6ab1c31c5025b2cadc823fb3d4cb38e6f779d5cd8007da3c790b69a Mar 18 09:21:26 crc kubenswrapper[4778]: I0318 09:21:26.592397 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 09:21:26 crc kubenswrapper[4778]: I0318 09:21:26.921314 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" event={"ID":"124c0069-debd-459c-9d66-f38d9d096996","Type":"ContainerStarted","Data":"81379cb4a6ab1c31c5025b2cadc823fb3d4cb38e6f779d5cd8007da3c790b69a"} Mar 18 09:21:27 crc kubenswrapper[4778]: I0318 09:21:27.961150 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-djmq6"] Mar 18 09:21:27 crc kubenswrapper[4778]: I0318 09:21:27.986854 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-djmq6" Mar 18 09:21:27 crc kubenswrapper[4778]: I0318 09:21:27.997997 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-2qjkd" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.001141 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.001413 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.008685 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djmq6"] Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.041512 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-zrlnv"] Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.043167 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.051304 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zrlnv"] Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.104987 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trqpk\" (UniqueName: \"kubernetes.io/projected/f58533cf-4c57-4c3a-b772-e2a488298d7e-kube-api-access-trqpk\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105058 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89cb0c73-439f-4178-bd96-f50b123bcd8a-scripts\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105082 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw5lj\" (UniqueName: \"kubernetes.io/projected/89cb0c73-439f-4178-bd96-f50b123bcd8a-kube-api-access-lw5lj\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-log\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105144 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58533cf-4c57-4c3a-b772-e2a488298d7e-ovn-controller-tls-certs\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105161 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-run-ovn\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105223 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-run\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105333 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-lib\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105375 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-log-ovn\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105482 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f58533cf-4c57-4c3a-b772-e2a488298d7e-scripts\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105521 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58533cf-4c57-4c3a-b772-e2a488298d7e-combined-ca-bundle\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105580 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-etc-ovs\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105699 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-run\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.206835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-run\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.206892 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trqpk\" (UniqueName: \"kubernetes.io/projected/f58533cf-4c57-4c3a-b772-e2a488298d7e-kube-api-access-trqpk\") 
pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.206940 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89cb0c73-439f-4178-bd96-f50b123bcd8a-scripts\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.206959 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw5lj\" (UniqueName: \"kubernetes.io/projected/89cb0c73-439f-4178-bd96-f50b123bcd8a-kube-api-access-lw5lj\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.206979 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-log\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207025 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58533cf-4c57-4c3a-b772-e2a488298d7e-ovn-controller-tls-certs\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207042 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-run-ovn\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " 
pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207065 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-run\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207092 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-lib\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207110 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-log-ovn\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207135 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58533cf-4c57-4c3a-b772-e2a488298d7e-scripts\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207153 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58533cf-4c57-4c3a-b772-e2a488298d7e-combined-ca-bundle\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207175 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-etc-ovs\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207990 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-etc-ovs\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.208141 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-log-ovn\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.208344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-run\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.208460 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-lib\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.211102 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-run\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " 
pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.211147 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58533cf-4c57-4c3a-b772-e2a488298d7e-scripts\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.211359 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-log\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.211959 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-run-ovn\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.217098 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58533cf-4c57-4c3a-b772-e2a488298d7e-combined-ca-bundle\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.217848 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89cb0c73-439f-4178-bd96-f50b123bcd8a-scripts\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.218265 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58533cf-4c57-4c3a-b772-e2a488298d7e-ovn-controller-tls-certs\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.226967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw5lj\" (UniqueName: \"kubernetes.io/projected/89cb0c73-439f-4178-bd96-f50b123bcd8a-kube-api-access-lw5lj\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.227365 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trqpk\" (UniqueName: \"kubernetes.io/projected/f58533cf-4c57-4c3a-b772-e2a488298d7e-kube-api-access-trqpk\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.334575 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.365332 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.730839 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.732083 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.734671 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-lxb7s" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.735023 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.735218 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.735228 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.735563 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.767773 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817277 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqr9h\" (UniqueName: \"kubernetes.io/projected/495e34ad-2f4d-46de-95e9-37b34a35f2d2-kube-api-access-cqr9h\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817352 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817381 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817405 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/495e34ad-2f4d-46de-95e9-37b34a35f2d2-config\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817420 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817467 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/495e34ad-2f4d-46de-95e9-37b34a35f2d2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817492 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817515 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/495e34ad-2f4d-46de-95e9-37b34a35f2d2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918577 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918630 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918654 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/495e34ad-2f4d-46de-95e9-37b34a35f2d2-config\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918709 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/495e34ad-2f4d-46de-95e9-37b34a35f2d2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc 
kubenswrapper[4778]: I0318 09:21:28.918740 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918766 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/495e34ad-2f4d-46de-95e9-37b34a35f2d2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918805 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqr9h\" (UniqueName: \"kubernetes.io/projected/495e34ad-2f4d-46de-95e9-37b34a35f2d2-kube-api-access-cqr9h\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.919012 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.919470 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/495e34ad-2f4d-46de-95e9-37b34a35f2d2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.919589 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/495e34ad-2f4d-46de-95e9-37b34a35f2d2-config\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.920490 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/495e34ad-2f4d-46de-95e9-37b34a35f2d2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.924512 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.925364 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.925672 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.952165 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqr9h\" (UniqueName: \"kubernetes.io/projected/495e34ad-2f4d-46de-95e9-37b34a35f2d2-kube-api-access-cqr9h\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " 
pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.984694 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:29 crc kubenswrapper[4778]: I0318 09:21:29.112815 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: W0318 09:21:31.420178 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49ce9560_3ee2_48d2_b016_a9feefb3a798.slice/crio-7e2a46cb51b397d310dc7e376a9a7d14cd4c698d1c5c6543d19669ba836b7c87 WatchSource:0}: Error finding container 7e2a46cb51b397d310dc7e376a9a7d14cd4c698d1c5c6543d19669ba836b7c87: Status 404 returned error can't find the container with id 7e2a46cb51b397d310dc7e376a9a7d14cd4c698d1c5c6543d19669ba836b7c87 Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.710682 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.712856 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.726825 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-blthw" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.727061 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.727184 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.727741 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.768633 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.836634 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887472 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887630 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887660 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h4ps\" (UniqueName: \"kubernetes.io/projected/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-kube-api-access-7h4ps\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887694 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887721 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887753 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887774 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.962923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"49ce9560-3ee2-48d2-b016-a9feefb3a798","Type":"ContainerStarted","Data":"7e2a46cb51b397d310dc7e376a9a7d14cd4c698d1c5c6543d19669ba836b7c87"} Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989750 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989789 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989814 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h4ps\" (UniqueName: \"kubernetes.io/projected/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-kube-api-access-7h4ps\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc 
kubenswrapper[4778]: I0318 09:21:31.989843 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989876 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989912 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989937 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.990606 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.990687 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.991450 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.991928 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.999130 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.999188 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.999458 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:32 crc kubenswrapper[4778]: I0318 
09:21:32.010565 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:32 crc kubenswrapper[4778]: I0318 09:21:32.014763 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h4ps\" (UniqueName: \"kubernetes.io/projected/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-kube-api-access-7h4ps\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:32 crc kubenswrapper[4778]: I0318 09:21:32.055957 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:34 crc kubenswrapper[4778]: W0318 09:21:34.907023 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfadc08e_9e77_4b6f_be89_fc7c726e85b7.slice/crio-fb0fc7f95277ccc99d0867676ec0e6cfd5c0179cd8b716a74e849e369777a77e WatchSource:0}: Error finding container fb0fc7f95277ccc99d0867676ec0e6cfd5c0179cd8b716a74e849e369777a77e: Status 404 returned error can't find the container with id fb0fc7f95277ccc99d0867676ec0e6cfd5c0179cd8b716a74e849e369777a77e Mar 18 09:21:34 crc kubenswrapper[4778]: E0318 09:21:34.943825 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 09:21:34 crc kubenswrapper[4778]: E0318 09:21:34.944142 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts 
--keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5n4kw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-csxwm_openstack(4c0c899c-e724-4486-bfba-42c7f089cfa7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:21:34 crc kubenswrapper[4778]: E0318 09:21:34.945335 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" podUID="4c0c899c-e724-4486-bfba-42c7f089cfa7" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.017077 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cfadc08e-9e77-4b6f-be89-fc7c726e85b7","Type":"ContainerStarted","Data":"fb0fc7f95277ccc99d0867676ec0e6cfd5c0179cd8b716a74e849e369777a77e"} Mar 18 09:21:35 crc kubenswrapper[4778]: E0318 09:21:35.017373 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 09:21:35 crc kubenswrapper[4778]: E0318 09:21:35.017651 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fvcvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-wbpnt_openstack(ef690ca0-3568-4334-bddc-956b11424d40): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:21:35 crc kubenswrapper[4778]: E0318 09:21:35.020510 4778 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" podUID="ef690ca0-3568-4334-bddc-956b11424d40" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.481520 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.482376 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:21:35 crc kubenswrapper[4778]: W0318 09:21:35.488997 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671fe1be_f3dd_475e_8c48_a1d1db510aef.slice/crio-fd11d5ce698fa7411014a038329cc43022813b8f74df72dacf0e69cfc2473b23 WatchSource:0}: Error finding container fd11d5ce698fa7411014a038329cc43022813b8f74df72dacf0e69cfc2473b23: Status 404 returned error can't find the container with id fd11d5ce698fa7411014a038329cc43022813b8f74df72dacf0e69cfc2473b23 Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.654165 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n4kw\" (UniqueName: \"kubernetes.io/projected/4c0c899c-e724-4486-bfba-42c7f089cfa7-kube-api-access-5n4kw\") pod \"4c0c899c-e724-4486-bfba-42c7f089cfa7\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.655234 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0c899c-e724-4486-bfba-42c7f089cfa7-config\") pod \"4c0c899c-e724-4486-bfba-42c7f089cfa7\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.655846 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4c0c899c-e724-4486-bfba-42c7f089cfa7-config" (OuterVolumeSpecName: "config") pod "4c0c899c-e724-4486-bfba-42c7f089cfa7" (UID: "4c0c899c-e724-4486-bfba-42c7f089cfa7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.660424 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0c899c-e724-4486-bfba-42c7f089cfa7-kube-api-access-5n4kw" (OuterVolumeSpecName: "kube-api-access-5n4kw") pod "4c0c899c-e724-4486-bfba-42c7f089cfa7" (UID: "4c0c899c-e724-4486-bfba-42c7f089cfa7"). InnerVolumeSpecName "kube-api-access-5n4kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.683849 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.694099 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 09:21:35 crc kubenswrapper[4778]: W0318 09:21:35.707444 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc50d224_cd65_4a46_b3d0_b40acdbda53d.slice/crio-4563c9b3379100c02eff6ea9b1139f424a6cb1b624eea7483fbfdd047b8da04e WatchSource:0}: Error finding container 4563c9b3379100c02eff6ea9b1139f424a6cb1b624eea7483fbfdd047b8da04e: Status 404 returned error can't find the container with id 4563c9b3379100c02eff6ea9b1139f424a6cb1b624eea7483fbfdd047b8da04e Mar 18 09:21:35 crc kubenswrapper[4778]: W0318 09:21:35.712845 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57955df9_f0c5_4cfc_91fd_135771be7ed2.slice/crio-9908701ec8146f186284a44b30a5e2c02918471c220798a76b04c8de271c7850 WatchSource:0}: Error finding container 9908701ec8146f186284a44b30a5e2c02918471c220798a76b04c8de271c7850: 
Status 404 returned error can't find the container with id 9908701ec8146f186284a44b30a5e2c02918471c220798a76b04c8de271c7850 Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.760815 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n4kw\" (UniqueName: \"kubernetes.io/projected/4c0c899c-e724-4486-bfba-42c7f089cfa7-kube-api-access-5n4kw\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.760882 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0c899c-e724-4486-bfba-42c7f089cfa7-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.795268 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.831411 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djmq6"] Mar 18 09:21:35 crc kubenswrapper[4778]: W0318 09:21:35.833404 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf58533cf_4c57_4c3a_b772_e2a488298d7e.slice/crio-d2ceca23ab2c5fec44125091c66355f512309c6254da6f49d10b25b4e90b4077 WatchSource:0}: Error finding container d2ceca23ab2c5fec44125091c66355f512309c6254da6f49d10b25b4e90b4077: Status 404 returned error can't find the container with id d2ceca23ab2c5fec44125091c66355f512309c6254da6f49d10b25b4e90b4077 Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.835111 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:21:35 crc kubenswrapper[4778]: W0318 09:21:35.845098 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45babbce_b5d2_4ad5_8bc2_a5047e777e8d.slice/crio-aea881ea048223a1a79ed7ce2d76ae73c7092387d05c9eaeaf63d09ce8b8e125 WatchSource:0}: Error 
finding container aea881ea048223a1a79ed7ce2d76ae73c7092387d05c9eaeaf63d09ce8b8e125: Status 404 returned error can't find the container with id aea881ea048223a1a79ed7ce2d76ae73c7092387d05c9eaeaf63d09ce8b8e125 Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.889536 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.026225 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" event={"ID":"4c0c899c-e724-4486-bfba-42c7f089cfa7","Type":"ContainerDied","Data":"4c2bd12d0d178885863f2b67ec45e234d90e570362b4cc7fec292b577d9c4441"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.026321 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.028901 4778 generic.go:334] "Generic (PLEG): container finished" podID="b849baae-7043-48dd-be08-0edde88c7c69" containerID="480d7c5317737b3e1aeba0fc6a1727b6f65e4c3e2f9bf9812586687ddd61c9ba" exitCode=0 Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.029264 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" event={"ID":"b849baae-7043-48dd-be08-0edde88c7c69","Type":"ContainerDied","Data":"480d7c5317737b3e1aeba0fc6a1727b6f65e4c3e2f9bf9812586687ddd61c9ba"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.030516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"671fe1be-f3dd-475e-8c48-a1d1db510aef","Type":"ContainerStarted","Data":"fd11d5ce698fa7411014a038329cc43022813b8f74df72dacf0e69cfc2473b23"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.036518 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"495e34ad-2f4d-46de-95e9-37b34a35f2d2","Type":"ContainerStarted","Data":"a2731392612b32d712d0c6d5c193713b53b7b3bb945d54484f35a26488fff227"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.037703 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fc50d224-cd65-4a46-b3d0-b40acdbda53d","Type":"ContainerStarted","Data":"4563c9b3379100c02eff6ea9b1139f424a6cb1b624eea7483fbfdd047b8da04e"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.041324 4778 generic.go:334] "Generic (PLEG): container finished" podID="124c0069-debd-459c-9d66-f38d9d096996" containerID="5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c" exitCode=0 Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.041450 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" event={"ID":"124c0069-debd-459c-9d66-f38d9d096996","Type":"ContainerDied","Data":"5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.046979 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djmq6" event={"ID":"f58533cf-4c57-4c3a-b772-e2a488298d7e","Type":"ContainerStarted","Data":"d2ceca23ab2c5fec44125091c66355f512309c6254da6f49d10b25b4e90b4077"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.052792 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa","Type":"ContainerStarted","Data":"cf2129c3bfbc296aad0527dd8ce0ba04e342011cc58c1e9bbaefda0cf47445b6"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.067819 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57955df9-f0c5-4cfc-91fd-135771be7ed2","Type":"ContainerStarted","Data":"9908701ec8146f186284a44b30a5e2c02918471c220798a76b04c8de271c7850"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.084560 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"45babbce-b5d2-4ad5-8bc2-a5047e777e8d","Type":"ContainerStarted","Data":"aea881ea048223a1a79ed7ce2d76ae73c7092387d05c9eaeaf63d09ce8b8e125"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.246947 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csxwm"] Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.258862 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csxwm"] Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.545939 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.675646 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvcvh\" (UniqueName: \"kubernetes.io/projected/ef690ca0-3568-4334-bddc-956b11424d40-kube-api-access-fvcvh\") pod \"ef690ca0-3568-4334-bddc-956b11424d40\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.675717 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-config\") pod \"ef690ca0-3568-4334-bddc-956b11424d40\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.675797 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-dns-svc\") pod \"ef690ca0-3568-4334-bddc-956b11424d40\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.676691 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-config" (OuterVolumeSpecName: "config") pod "ef690ca0-3568-4334-bddc-956b11424d40" (UID: "ef690ca0-3568-4334-bddc-956b11424d40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.677243 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef690ca0-3568-4334-bddc-956b11424d40" (UID: "ef690ca0-3568-4334-bddc-956b11424d40"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.686365 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef690ca0-3568-4334-bddc-956b11424d40-kube-api-access-fvcvh" (OuterVolumeSpecName: "kube-api-access-fvcvh") pod "ef690ca0-3568-4334-bddc-956b11424d40" (UID: "ef690ca0-3568-4334-bddc-956b11424d40"). InnerVolumeSpecName "kube-api-access-fvcvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.750292 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zrlnv"] Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.777165 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvcvh\" (UniqueName: \"kubernetes.io/projected/ef690ca0-3568-4334-bddc-956b11424d40-kube-api-access-fvcvh\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.777231 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.777246 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.095908 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zrlnv" event={"ID":"89cb0c73-439f-4178-bd96-f50b123bcd8a","Type":"ContainerStarted","Data":"3253b8eecb6c9436abd551deb3597dc9dea83fbfeb9a51958d9c12c740b797f5"} Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.099031 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" event={"ID":"ef690ca0-3568-4334-bddc-956b11424d40","Type":"ContainerDied","Data":"56d81e5e06ecb4723728e490b7892ca9d5f51a511bc3931f818bfe5aaf2487d9"} Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.099168 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.108482 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" event={"ID":"124c0069-debd-459c-9d66-f38d9d096996","Type":"ContainerStarted","Data":"1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e"} Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.108997 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.112647 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" event={"ID":"b849baae-7043-48dd-be08-0edde88c7c69","Type":"ContainerStarted","Data":"f8db45d45042eecaafb866cb991b1dc5586cee69f57e924f2a599715ace70e35"} Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.113072 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.130168 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" podStartSLOduration=10.116715644 podStartE2EDuration="19.130149514s" podCreationTimestamp="2026-03-18 09:21:18 +0000 UTC" firstStartedPulling="2026-03-18 09:21:26.126346793 +0000 UTC m=+1152.701091633" lastFinishedPulling="2026-03-18 09:21:35.139780653 +0000 UTC m=+1161.714525503" observedRunningTime="2026-03-18 09:21:37.129853056 +0000 UTC m=+1163.704597906" watchObservedRunningTime="2026-03-18 09:21:37.130149514 +0000 UTC m=+1163.704894344" Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.170245 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbpnt"] Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.176628 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-wbpnt"] Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.183748 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" podStartSLOduration=6.9876135040000005 podStartE2EDuration="19.183732747s" podCreationTimestamp="2026-03-18 09:21:18 +0000 UTC" firstStartedPulling="2026-03-18 09:21:22.924781775 +0000 UTC m=+1149.499526615" lastFinishedPulling="2026-03-18 09:21:35.120901018 +0000 UTC m=+1161.695645858" observedRunningTime="2026-03-18 09:21:37.179786548 +0000 UTC m=+1163.754531408" watchObservedRunningTime="2026-03-18 09:21:37.183732747 +0000 UTC m=+1163.758477577" Mar 18 09:21:38 crc kubenswrapper[4778]: I0318 09:21:38.199132 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0c899c-e724-4486-bfba-42c7f089cfa7" path="/var/lib/kubelet/pods/4c0c899c-e724-4486-bfba-42c7f089cfa7/volumes" Mar 18 09:21:38 crc kubenswrapper[4778]: I0318 09:21:38.199808 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef690ca0-3568-4334-bddc-956b11424d40" path="/var/lib/kubelet/pods/ef690ca0-3568-4334-bddc-956b11424d40/volumes" Mar 18 09:21:43 crc kubenswrapper[4778]: I0318 09:21:43.610048 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:44 crc kubenswrapper[4778]: I0318 09:21:44.002466 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:44 crc kubenswrapper[4778]: I0318 09:21:44.077940 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ffznk"] Mar 18 09:21:44 crc kubenswrapper[4778]: I0318 09:21:44.170303 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" podUID="b849baae-7043-48dd-be08-0edde88c7c69" containerName="dnsmasq-dns" 
containerID="cri-o://f8db45d45042eecaafb866cb991b1dc5586cee69f57e924f2a599715ace70e35" gracePeriod=10 Mar 18 09:21:45 crc kubenswrapper[4778]: I0318 09:21:45.182071 4778 generic.go:334] "Generic (PLEG): container finished" podID="b849baae-7043-48dd-be08-0edde88c7c69" containerID="f8db45d45042eecaafb866cb991b1dc5586cee69f57e924f2a599715ace70e35" exitCode=0 Mar 18 09:21:45 crc kubenswrapper[4778]: I0318 09:21:45.182163 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" event={"ID":"b849baae-7043-48dd-be08-0edde88c7c69","Type":"ContainerDied","Data":"f8db45d45042eecaafb866cb991b1dc5586cee69f57e924f2a599715ace70e35"} Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.708136 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.888242 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-dns-svc\") pod \"b849baae-7043-48dd-be08-0edde88c7c69\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.888365 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7prc\" (UniqueName: \"kubernetes.io/projected/b849baae-7043-48dd-be08-0edde88c7c69-kube-api-access-v7prc\") pod \"b849baae-7043-48dd-be08-0edde88c7c69\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.888425 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-config\") pod \"b849baae-7043-48dd-be08-0edde88c7c69\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.893026 4778 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b849baae-7043-48dd-be08-0edde88c7c69-kube-api-access-v7prc" (OuterVolumeSpecName: "kube-api-access-v7prc") pod "b849baae-7043-48dd-be08-0edde88c7c69" (UID: "b849baae-7043-48dd-be08-0edde88c7c69"). InnerVolumeSpecName "kube-api-access-v7prc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.922509 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b849baae-7043-48dd-be08-0edde88c7c69" (UID: "b849baae-7043-48dd-be08-0edde88c7c69"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.926399 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-config" (OuterVolumeSpecName: "config") pod "b849baae-7043-48dd-be08-0edde88c7c69" (UID: "b849baae-7043-48dd-be08-0edde88c7c69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.990499 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7prc\" (UniqueName: \"kubernetes.io/projected/b849baae-7043-48dd-be08-0edde88c7c69-kube-api-access-v7prc\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.990559 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.990575 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:47 crc kubenswrapper[4778]: I0318 09:21:47.202850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" event={"ID":"b849baae-7043-48dd-be08-0edde88c7c69","Type":"ContainerDied","Data":"32885a5d9fd1876608960a09778b3825b17dc07db2a1dad6becf112b8746266b"} Mar 18 09:21:47 crc kubenswrapper[4778]: I0318 09:21:47.202898 4778 scope.go:117] "RemoveContainer" containerID="f8db45d45042eecaafb866cb991b1dc5586cee69f57e924f2a599715ace70e35" Mar 18 09:21:47 crc kubenswrapper[4778]: I0318 09:21:47.203030 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:47 crc kubenswrapper[4778]: I0318 09:21:47.365661 4778 scope.go:117] "RemoveContainer" containerID="480d7c5317737b3e1aeba0fc6a1727b6f65e4c3e2f9bf9812586687ddd61c9ba" Mar 18 09:21:47 crc kubenswrapper[4778]: I0318 09:21:47.450376 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ffznk"] Mar 18 09:21:47 crc kubenswrapper[4778]: I0318 09:21:47.462133 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ffznk"] Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.198511 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b849baae-7043-48dd-be08-0edde88c7c69" path="/var/lib/kubelet/pods/b849baae-7043-48dd-be08-0edde88c7c69/volumes" Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.216568 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"49ce9560-3ee2-48d2-b016-a9feefb3a798","Type":"ContainerStarted","Data":"e466796f0e2910694ab8337ebde4ff7c0ee96e7134f39a9f2c45f915fab1baea"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.219078 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"45babbce-b5d2-4ad5-8bc2-a5047e777e8d","Type":"ContainerStarted","Data":"3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.219279 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.220704 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"495e34ad-2f4d-46de-95e9-37b34a35f2d2","Type":"ContainerStarted","Data":"22aaa7f3cb8f55c070d8fd5d609fc29b14dabd44a738ffe416442d903d142eb9"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.222278 4778 
generic.go:334] "Generic (PLEG): container finished" podID="89cb0c73-439f-4178-bd96-f50b123bcd8a" containerID="83d3eeb91dd4b140efab29c838ac8560e0261dbb0dfbb67f93c2d05a75aff55f" exitCode=0 Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.222430 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zrlnv" event={"ID":"89cb0c73-439f-4178-bd96-f50b123bcd8a","Type":"ContainerDied","Data":"83d3eeb91dd4b140efab29c838ac8560e0261dbb0dfbb67f93c2d05a75aff55f"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.226458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djmq6" event={"ID":"f58533cf-4c57-4c3a-b772-e2a488298d7e","Type":"ContainerStarted","Data":"916a2309a194229e2e394cdbba50febc7592c5dcb8bd37a5590e8a134958e250"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.227090 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-djmq6" Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.235026 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa","Type":"ContainerStarted","Data":"cfb1c6cc4c019330e955211c68e17b78588bcdd17ac1ca2f93074744bc3c1bfb"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.236746 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cfadc08e-9e77-4b6f-be89-fc7c726e85b7","Type":"ContainerStarted","Data":"9ce149bc04a6fba1a6cb5e9273cbc389efc89e1e3e6563d1e314e619d97696a8"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.241995 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fc50d224-cd65-4a46-b3d0-b40acdbda53d","Type":"ContainerStarted","Data":"e110d5dc54c472940a9a343eaf5e6004cfb943d1a192a88ab0682ef45891638a"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.242267 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/memcached-0" Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.271974 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.909123276 podStartE2EDuration="24.271950148s" podCreationTimestamp="2026-03-18 09:21:24 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.857977853 +0000 UTC m=+1162.432722693" lastFinishedPulling="2026-03-18 09:21:47.220804725 +0000 UTC m=+1173.795549565" observedRunningTime="2026-03-18 09:21:48.262648384 +0000 UTC m=+1174.837393224" watchObservedRunningTime="2026-03-18 09:21:48.271950148 +0000 UTC m=+1174.846694988" Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.289789 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-djmq6" podStartSLOduration=10.562483617 podStartE2EDuration="21.289764484s" podCreationTimestamp="2026-03-18 09:21:27 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.83695941 +0000 UTC m=+1162.411704250" lastFinishedPulling="2026-03-18 09:21:46.564240287 +0000 UTC m=+1173.138985117" observedRunningTime="2026-03-18 09:21:48.286886865 +0000 UTC m=+1174.861631715" watchObservedRunningTime="2026-03-18 09:21:48.289764484 +0000 UTC m=+1174.864509324" Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.365757 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.909660436 podStartE2EDuration="26.365737966s" podCreationTimestamp="2026-03-18 09:21:22 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.711847007 +0000 UTC m=+1162.286591847" lastFinishedPulling="2026-03-18 09:21:46.167924537 +0000 UTC m=+1172.742669377" observedRunningTime="2026-03-18 09:21:48.362563889 +0000 UTC m=+1174.937308769" watchObservedRunningTime="2026-03-18 09:21:48.365737966 +0000 UTC m=+1174.940482806" Mar 18 09:21:49 crc kubenswrapper[4778]: I0318 09:21:49.251477 4778 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57955df9-f0c5-4cfc-91fd-135771be7ed2","Type":"ContainerStarted","Data":"3698e124eba53a15e3f16dfe6346805545443e4b8ce94d12254d508326508979"} Mar 18 09:21:49 crc kubenswrapper[4778]: I0318 09:21:49.254129 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"671fe1be-f3dd-475e-8c48-a1d1db510aef","Type":"ContainerStarted","Data":"fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7"} Mar 18 09:21:49 crc kubenswrapper[4778]: I0318 09:21:49.257550 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zrlnv" event={"ID":"89cb0c73-439f-4178-bd96-f50b123bcd8a","Type":"ContainerStarted","Data":"aa61b34dff12ea16006419adc618717fc2a790c20d7f1936c9ccab067065f522"} Mar 18 09:21:49 crc kubenswrapper[4778]: I0318 09:21:49.257578 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zrlnv" event={"ID":"89cb0c73-439f-4178-bd96-f50b123bcd8a","Type":"ContainerStarted","Data":"18d6161525c304e329e2b376f46994f03d27fde45d318a3922ea711c3d1191a2"} Mar 18 09:21:49 crc kubenswrapper[4778]: I0318 09:21:49.307980 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-zrlnv" podStartSLOduration=12.858162606 podStartE2EDuration="22.307932616s" podCreationTimestamp="2026-03-18 09:21:27 +0000 UTC" firstStartedPulling="2026-03-18 09:21:36.762249409 +0000 UTC m=+1163.336994249" lastFinishedPulling="2026-03-18 09:21:46.212019409 +0000 UTC m=+1172.786764259" observedRunningTime="2026-03-18 09:21:49.307303189 +0000 UTC m=+1175.882048039" watchObservedRunningTime="2026-03-18 09:21:49.307932616 +0000 UTC m=+1175.882677456" Mar 18 09:21:50 crc kubenswrapper[4778]: I0318 09:21:50.266252 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:50 crc kubenswrapper[4778]: I0318 09:21:50.266851 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:51 crc kubenswrapper[4778]: I0318 09:21:51.277999 4778 generic.go:334] "Generic (PLEG): container finished" podID="49ce9560-3ee2-48d2-b016-a9feefb3a798" containerID="e466796f0e2910694ab8337ebde4ff7c0ee96e7134f39a9f2c45f915fab1baea" exitCode=0 Mar 18 09:21:51 crc kubenswrapper[4778]: I0318 09:21:51.278086 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"49ce9560-3ee2-48d2-b016-a9feefb3a798","Type":"ContainerDied","Data":"e466796f0e2910694ab8337ebde4ff7c0ee96e7134f39a9f2c45f915fab1baea"} Mar 18 09:21:51 crc kubenswrapper[4778]: I0318 09:21:51.283052 4778 generic.go:334] "Generic (PLEG): container finished" podID="cfadc08e-9e77-4b6f-be89-fc7c726e85b7" containerID="9ce149bc04a6fba1a6cb5e9273cbc389efc89e1e3e6563d1e314e619d97696a8" exitCode=0 Mar 18 09:21:51 crc kubenswrapper[4778]: I0318 09:21:51.283146 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cfadc08e-9e77-4b6f-be89-fc7c726e85b7","Type":"ContainerDied","Data":"9ce149bc04a6fba1a6cb5e9273cbc389efc89e1e3e6563d1e314e619d97696a8"} Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.295892 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"49ce9560-3ee2-48d2-b016-a9feefb3a798","Type":"ContainerStarted","Data":"a00f6e5d8b705b53000302e3ddae59a971386417b12cf0ca55d037a9eebb3804"} Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.299840 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cfadc08e-9e77-4b6f-be89-fc7c726e85b7","Type":"ContainerStarted","Data":"09d96c8246f4a209ab57e5f9ba16639880458ea9735ca72008c97464471540af"} Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.302591 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"495e34ad-2f4d-46de-95e9-37b34a35f2d2","Type":"ContainerStarted","Data":"13d9ec5ff76e3f55f69c1fe037a0b107797202b066241284d892f1df1f0a163d"} Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.306887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa","Type":"ContainerStarted","Data":"85ed145be914a5d426246153108571aab2c58751d7c73029061c8ae09d1bd58b"} Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.325497 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=17.186349948 podStartE2EDuration="32.325470646s" podCreationTimestamp="2026-03-18 09:21:20 +0000 UTC" firstStartedPulling="2026-03-18 09:21:31.425409986 +0000 UTC m=+1158.000154826" lastFinishedPulling="2026-03-18 09:21:46.564530684 +0000 UTC m=+1173.139275524" observedRunningTime="2026-03-18 09:21:52.323183554 +0000 UTC m=+1178.897928424" watchObservedRunningTime="2026-03-18 09:21:52.325470646 +0000 UTC m=+1178.900215506" Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.352434 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.352508 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.365921 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.593062114 podStartE2EDuration="25.365883358s" podCreationTimestamp="2026-03-18 09:21:27 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.845878733 +0000 UTC m=+1162.420623573" lastFinishedPulling="2026-03-18 09:21:51.618699957 +0000 UTC m=+1178.193444817" observedRunningTime="2026-03-18 09:21:52.358820036 +0000 UTC m=+1178.933564936" watchObservedRunningTime="2026-03-18 
09:21:52.365883358 +0000 UTC m=+1178.940628238" Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.395858 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.675569015 podStartE2EDuration="22.395830405s" podCreationTimestamp="2026-03-18 09:21:30 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.898397736 +0000 UTC m=+1162.473142576" lastFinishedPulling="2026-03-18 09:21:51.618659126 +0000 UTC m=+1178.193403966" observedRunningTime="2026-03-18 09:21:52.38170728 +0000 UTC m=+1178.956452130" watchObservedRunningTime="2026-03-18 09:21:52.395830405 +0000 UTC m=+1178.970575275" Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.420000 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.200039321 podStartE2EDuration="33.419975573s" podCreationTimestamp="2026-03-18 09:21:19 +0000 UTC" firstStartedPulling="2026-03-18 09:21:34.909182313 +0000 UTC m=+1161.483927193" lastFinishedPulling="2026-03-18 09:21:47.129118605 +0000 UTC m=+1173.703863445" observedRunningTime="2026-03-18 09:21:52.404742388 +0000 UTC m=+1178.979487268" watchObservedRunningTime="2026-03-18 09:21:52.419975573 +0000 UTC m=+1178.994720433" Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.729487 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.056967 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.101514 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.113358 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: 
I0318 09:21:53.163260 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.316779 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.318088 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.363625 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.389892 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.648699 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-gdjvv"] Mar 18 09:21:53 crc kubenswrapper[4778]: E0318 09:21:53.649179 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b849baae-7043-48dd-be08-0edde88c7c69" containerName="init" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.649233 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b849baae-7043-48dd-be08-0edde88c7c69" containerName="init" Mar 18 09:21:53 crc kubenswrapper[4778]: E0318 09:21:53.649263 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b849baae-7043-48dd-be08-0edde88c7c69" containerName="dnsmasq-dns" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.649271 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b849baae-7043-48dd-be08-0edde88c7c69" containerName="dnsmasq-dns" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.649420 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b849baae-7043-48dd-be08-0edde88c7c69" containerName="dnsmasq-dns" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.650225 4778 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.652450 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.662068 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-gdjvv"] Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.727981 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.728034 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vtq4\" (UniqueName: \"kubernetes.io/projected/be59e02d-1aa2-4a26-b261-4f837e555f2d-kube-api-access-7vtq4\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.728151 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.728183 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-config\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: 
\"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.835246 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vtq4\" (UniqueName: \"kubernetes.io/projected/be59e02d-1aa2-4a26-b261-4f837e555f2d-kube-api-access-7vtq4\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.835621 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.835684 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-config\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.835785 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.836703 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 
18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.836740 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.837298 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-config\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.870807 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vtq4\" (UniqueName: \"kubernetes.io/projected/be59e02d-1aa2-4a26-b261-4f837e555f2d-kube-api-access-7vtq4\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.878806 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-2ldk7"] Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.879887 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.884302 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.899615 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-2ldk7"] Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.936690 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-config\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.936746 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-ovs-rundir\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.936796 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqntm\" (UniqueName: \"kubernetes.io/projected/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-kube-api-access-qqntm\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.936828 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") 
" pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.936850 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-combined-ca-bundle\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.936864 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-ovn-rundir\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.961046 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-gdjvv"] Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.961649 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.998031 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-jt8gb"] Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.002447 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.006894 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.030925 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jt8gb"] Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.043417 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqntm\" (UniqueName: \"kubernetes.io/projected/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-kube-api-access-qqntm\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.043506 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.043541 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-combined-ca-bundle\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.043565 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-ovn-rundir\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc 
kubenswrapper[4778]: I0318 09:21:54.043634 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-config\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.043671 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-ovs-rundir\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.044074 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-ovs-rundir\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.044659 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-ovn-rundir\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.045265 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-config\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.052465 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 
09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.053848 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.053919 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-combined-ca-bundle\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.053925 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.063994 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.064208 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.064349 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-q6qgm" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.064456 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.078610 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqntm\" (UniqueName: \"kubernetes.io/projected/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-kube-api-access-qqntm\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " 
pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.097547 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146746 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146785 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-scripts\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146811 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-config\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146831 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr7b2\" (UniqueName: \"kubernetes.io/projected/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-kube-api-access-xr7b2\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146869 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-nb\") pod 
\"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146888 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqgnd\" (UniqueName: \"kubernetes.io/projected/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-kube-api-access-qqgnd\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146940 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146959 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-dns-svc\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146980 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-config\") pod 
\"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.147012 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.147028 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.230816 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248373 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248625 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248668 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248692 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-scripts\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248721 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-config\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248747 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr7b2\" (UniqueName: \"kubernetes.io/projected/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-kube-api-access-xr7b2\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248794 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248815 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqgnd\" (UniqueName: \"kubernetes.io/projected/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-kube-api-access-qqgnd\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: 
\"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248842 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248872 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248888 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-dns-svc\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248913 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-config\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.249885 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-config\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.252004 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.252004 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-scripts\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.252465 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.252621 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-config\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.252997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.253058 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-dns-svc\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: 
\"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.253708 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.261137 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.261988 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.273635 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr7b2\" (UniqueName: \"kubernetes.io/projected/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-kube-api-access-xr7b2\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.276591 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqgnd\" (UniqueName: \"kubernetes.io/projected/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-kube-api-access-qqgnd\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 
09:21:54.435773 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.469649 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.494094 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-2ldk7"] Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.538468 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-gdjvv"] Mar 18 09:21:54 crc kubenswrapper[4778]: W0318 09:21:54.570425 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe59e02d_1aa2_4a26_b261_4f837e555f2d.slice/crio-886cae58e9b849d83802ee865131972663fcca4faa247e187de8d3fbac77a4a2 WatchSource:0}: Error finding container 886cae58e9b849d83802ee865131972663fcca4faa247e187de8d3fbac77a4a2: Status 404 returned error can't find the container with id 886cae58e9b849d83802ee865131972663fcca4faa247e187de8d3fbac77a4a2 Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.854031 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.958147 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jt8gb"] Mar 18 09:21:54 crc kubenswrapper[4778]: W0318 09:21:54.960037 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd539dcf0_c5ce_4c6c_b367_e5c3d7dac5d5.slice/crio-95b556045a6555de0e6526548932d8f77419a56ad86f2e3afbe7000b4b9694c6 WatchSource:0}: Error finding container 95b556045a6555de0e6526548932d8f77419a56ad86f2e3afbe7000b4b9694c6: Status 404 returned error can't find the container with id 
95b556045a6555de0e6526548932d8f77419a56ad86f2e3afbe7000b4b9694c6 Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.967542 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.361664 4778 generic.go:334] "Generic (PLEG): container finished" podID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerID="67e25fe46d44f377f33f04f4df4c1bf2f96243fad2d8a6135f426851e5ac8c58" exitCode=0 Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.361714 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jt8gb" event={"ID":"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5","Type":"ContainerDied","Data":"67e25fe46d44f377f33f04f4df4c1bf2f96243fad2d8a6135f426851e5ac8c58"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.361944 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jt8gb" event={"ID":"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5","Type":"ContainerStarted","Data":"95b556045a6555de0e6526548932d8f77419a56ad86f2e3afbe7000b4b9694c6"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.363795 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-2ldk7" event={"ID":"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9","Type":"ContainerStarted","Data":"ffb886d6d76f714c8705f070e5e3d961cd18629f5b516482dd69f4f1977b285f"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.363843 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-2ldk7" event={"ID":"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9","Type":"ContainerStarted","Data":"9763fefc4f9d620f8f7669863a497795208b6498d97df55c051aa149c9720814"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.365319 4778 generic.go:334] "Generic (PLEG): container finished" podID="be59e02d-1aa2-4a26-b261-4f837e555f2d" containerID="de05fcadb2a51526a0dae5a25b8dfc1f6d83df10f0d77a3201f98be36fc9ca83" exitCode=0 Mar 18 09:21:55 
crc kubenswrapper[4778]: I0318 09:21:55.365371 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" event={"ID":"be59e02d-1aa2-4a26-b261-4f837e555f2d","Type":"ContainerDied","Data":"de05fcadb2a51526a0dae5a25b8dfc1f6d83df10f0d77a3201f98be36fc9ca83"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.365387 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" event={"ID":"be59e02d-1aa2-4a26-b261-4f837e555f2d","Type":"ContainerStarted","Data":"886cae58e9b849d83802ee865131972663fcca4faa247e187de8d3fbac77a4a2"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.366403 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0","Type":"ContainerStarted","Data":"967590a1812cba79ddc734fdd7b654f806eaeb73d07a4f802ae6196fe173eaec"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.400782 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-2ldk7" podStartSLOduration=2.400764 podStartE2EDuration="2.400764s" podCreationTimestamp="2026-03-18 09:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:21:55.395383275 +0000 UTC m=+1181.970128125" watchObservedRunningTime="2026-03-18 09:21:55.400764 +0000 UTC m=+1181.975508850" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.690267 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.775841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-ovsdbserver-sb\") pod \"be59e02d-1aa2-4a26-b261-4f837e555f2d\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.776687 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-config\") pod \"be59e02d-1aa2-4a26-b261-4f837e555f2d\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.776730 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vtq4\" (UniqueName: \"kubernetes.io/projected/be59e02d-1aa2-4a26-b261-4f837e555f2d-kube-api-access-7vtq4\") pod \"be59e02d-1aa2-4a26-b261-4f837e555f2d\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.776845 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-dns-svc\") pod \"be59e02d-1aa2-4a26-b261-4f837e555f2d\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.782462 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be59e02d-1aa2-4a26-b261-4f837e555f2d-kube-api-access-7vtq4" (OuterVolumeSpecName: "kube-api-access-7vtq4") pod "be59e02d-1aa2-4a26-b261-4f837e555f2d" (UID: "be59e02d-1aa2-4a26-b261-4f837e555f2d"). InnerVolumeSpecName "kube-api-access-7vtq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.795504 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "be59e02d-1aa2-4a26-b261-4f837e555f2d" (UID: "be59e02d-1aa2-4a26-b261-4f837e555f2d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.797980 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "be59e02d-1aa2-4a26-b261-4f837e555f2d" (UID: "be59e02d-1aa2-4a26-b261-4f837e555f2d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.806274 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-config" (OuterVolumeSpecName: "config") pod "be59e02d-1aa2-4a26-b261-4f837e555f2d" (UID: "be59e02d-1aa2-4a26-b261-4f837e555f2d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.878923 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.878959 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.878969 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vtq4\" (UniqueName: \"kubernetes.io/projected/be59e02d-1aa2-4a26-b261-4f837e555f2d-kube-api-access-7vtq4\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.878983 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.376140 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0","Type":"ContainerStarted","Data":"80586af0325ebbab3b0342dd081ce20efed8640a270588eda509775f80d4ba92"} Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.378661 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jt8gb" event={"ID":"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5","Type":"ContainerStarted","Data":"0cb5262a9d3d057ca53ec602e21c523692c8679333829bacc24282be5307d258"} Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.378818 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.380070 4778 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.380107 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" event={"ID":"be59e02d-1aa2-4a26-b261-4f837e555f2d","Type":"ContainerDied","Data":"886cae58e9b849d83802ee865131972663fcca4faa247e187de8d3fbac77a4a2"} Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.380132 4778 scope.go:117] "RemoveContainer" containerID="de05fcadb2a51526a0dae5a25b8dfc1f6d83df10f0d77a3201f98be36fc9ca83" Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.409556 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-jt8gb" podStartSLOduration=3.409532168 podStartE2EDuration="3.409532168s" podCreationTimestamp="2026-03-18 09:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:21:56.404433078 +0000 UTC m=+1182.979177938" watchObservedRunningTime="2026-03-18 09:21:56.409532168 +0000 UTC m=+1182.984277038" Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.466315 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-gdjvv"] Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.470149 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-gdjvv"] Mar 18 09:21:57 crc kubenswrapper[4778]: I0318 09:21:57.389633 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0","Type":"ContainerStarted","Data":"9cb9f8c938b08681165a7e0ffa0dd54d6c223ddeec4f61e609b3e802c5cd34cc"} Mar 18 09:21:57 crc kubenswrapper[4778]: I0318 09:21:57.389828 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 09:21:57 crc kubenswrapper[4778]: I0318 
09:21:57.422618 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.289491122 podStartE2EDuration="3.42260298s" podCreationTimestamp="2026-03-18 09:21:54 +0000 UTC" firstStartedPulling="2026-03-18 09:21:54.971127442 +0000 UTC m=+1181.545872282" lastFinishedPulling="2026-03-18 09:21:56.1042393 +0000 UTC m=+1182.678984140" observedRunningTime="2026-03-18 09:21:57.418732395 +0000 UTC m=+1183.993477245" watchObservedRunningTime="2026-03-18 09:21:57.42260298 +0000 UTC m=+1183.997347820" Mar 18 09:21:58 crc kubenswrapper[4778]: I0318 09:21:58.195737 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be59e02d-1aa2-4a26-b261-4f837e555f2d" path="/var/lib/kubelet/pods/be59e02d-1aa2-4a26-b261-4f837e555f2d/volumes" Mar 18 09:21:58 crc kubenswrapper[4778]: I0318 09:21:58.488045 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:58 crc kubenswrapper[4778]: I0318 09:21:58.602992 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.141712 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563762-nk868"] Mar 18 09:22:00 crc kubenswrapper[4778]: E0318 09:22:00.142414 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be59e02d-1aa2-4a26-b261-4f837e555f2d" containerName="init" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.142430 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be59e02d-1aa2-4a26-b261-4f837e555f2d" containerName="init" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.142605 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="be59e02d-1aa2-4a26-b261-4f837e555f2d" containerName="init" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.143130 4778 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.146532 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.146746 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.147619 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.147708 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.158737 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.160878 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563762-nk868"] Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.269839 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhg95\" (UniqueName: \"kubernetes.io/projected/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d-kube-api-access-dhg95\") pod \"auto-csr-approver-29563762-nk868\" (UID: \"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d\") " pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 
09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.372289 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhg95\" (UniqueName: \"kubernetes.io/projected/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d-kube-api-access-dhg95\") pod \"auto-csr-approver-29563762-nk868\" (UID: \"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d\") " pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.396734 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhg95\" (UniqueName: \"kubernetes.io/projected/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d-kube-api-access-dhg95\") pod \"auto-csr-approver-29563762-nk868\" (UID: \"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d\") " pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.473012 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.986814 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563762-nk868"] Mar 18 09:22:00 crc kubenswrapper[4778]: W0318 09:22:00.991105 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bcf145e_ae6a_4674_9f79_b6486ec2fa9d.slice/crio-0d4f08f11dfe1809584f3cab0e3d071c5c19f66ebf3a573a7b3edd6c9c3b4163 WatchSource:0}: Error finding container 0d4f08f11dfe1809584f3cab0e3d071c5c19f66ebf3a573a7b3edd6c9c3b4163: Status 404 returned error can't find the container with id 0d4f08f11dfe1809584f3cab0e3d071c5c19f66ebf3a573a7b3edd6c9c3b4163 Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.050054 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-nnzs2"] Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.051559 4778 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.054148 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.067463 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nnzs2"] Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.082381 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.082434 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.192512 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tmwp\" (UniqueName: \"kubernetes.io/projected/ae95589d-d3fb-4254-9fd0-f59203e0e927-kube-api-access-8tmwp\") pod \"root-account-create-update-nnzs2\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.192895 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae95589d-d3fb-4254-9fd0-f59203e0e927-operator-scripts\") pod \"root-account-create-update-nnzs2\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.197660 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.294951 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tmwp\" (UniqueName: 
\"kubernetes.io/projected/ae95589d-d3fb-4254-9fd0-f59203e0e927-kube-api-access-8tmwp\") pod \"root-account-create-update-nnzs2\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.295042 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae95589d-d3fb-4254-9fd0-f59203e0e927-operator-scripts\") pod \"root-account-create-update-nnzs2\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.296297 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae95589d-d3fb-4254-9fd0-f59203e0e927-operator-scripts\") pod \"root-account-create-update-nnzs2\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.322697 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tmwp\" (UniqueName: \"kubernetes.io/projected/ae95589d-d3fb-4254-9fd0-f59203e0e927-kube-api-access-8tmwp\") pod \"root-account-create-update-nnzs2\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.394898 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.428987 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563762-nk868" event={"ID":"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d","Type":"ContainerStarted","Data":"0d4f08f11dfe1809584f3cab0e3d071c5c19f66ebf3a573a7b3edd6c9c3b4163"} Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.524999 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.879899 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nnzs2"] Mar 18 09:22:02 crc kubenswrapper[4778]: I0318 09:22:02.437862 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nnzs2" event={"ID":"ae95589d-d3fb-4254-9fd0-f59203e0e927","Type":"ContainerStarted","Data":"a2153b59b29a56867da95a8bfa79fff53d74a9d7fba4ed7aa6073d0c04ef364a"} Mar 18 09:22:02 crc kubenswrapper[4778]: I0318 09:22:02.917755 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-dz4jc"] Mar 18 09:22:02 crc kubenswrapper[4778]: I0318 09:22:02.919324 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:02 crc kubenswrapper[4778]: I0318 09:22:02.932665 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dz4jc"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.064593 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5387-account-create-update-wm5k5"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.066101 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.069569 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.073726 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5387-account-create-update-wm5k5"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.135493 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9222d9a-6507-4c32-9234-2c1c2b27a11e-operator-scripts\") pod \"glance-db-create-dz4jc\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.135584 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc2sm\" (UniqueName: \"kubernetes.io/projected/f9222d9a-6507-4c32-9234-2c1c2b27a11e-kube-api-access-bc2sm\") pod \"glance-db-create-dz4jc\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.236769 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9222d9a-6507-4c32-9234-2c1c2b27a11e-operator-scripts\") pod \"glance-db-create-dz4jc\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.236849 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m6wt\" (UniqueName: \"kubernetes.io/projected/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-kube-api-access-8m6wt\") pod \"glance-5387-account-create-update-wm5k5\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " 
pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.236887 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-operator-scripts\") pod \"glance-5387-account-create-update-wm5k5\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.236942 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc2sm\" (UniqueName: \"kubernetes.io/projected/f9222d9a-6507-4c32-9234-2c1c2b27a11e-kube-api-access-bc2sm\") pod \"glance-db-create-dz4jc\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.238237 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9222d9a-6507-4c32-9234-2c1c2b27a11e-operator-scripts\") pod \"glance-db-create-dz4jc\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.260102 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc2sm\" (UniqueName: \"kubernetes.io/projected/f9222d9a-6507-4c32-9234-2c1c2b27a11e-kube-api-access-bc2sm\") pod \"glance-db-create-dz4jc\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.265352 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.339055 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m6wt\" (UniqueName: \"kubernetes.io/projected/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-kube-api-access-8m6wt\") pod \"glance-5387-account-create-update-wm5k5\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.339127 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-operator-scripts\") pod \"glance-5387-account-create-update-wm5k5\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.340104 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-operator-scripts\") pod \"glance-5387-account-create-update-wm5k5\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.379482 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m6wt\" (UniqueName: \"kubernetes.io/projected/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-kube-api-access-8m6wt\") pod \"glance-5387-account-create-update-wm5k5\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.394111 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.745401 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dz4jc"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.759302 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-l6vl8"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.760819 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.785988 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-l6vl8"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.853286 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfgmc\" (UniqueName: \"kubernetes.io/projected/51a820a6-6a95-4ab7-a9d8-6649fe45464a-kube-api-access-jfgmc\") pod \"keystone-db-create-l6vl8\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.853417 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a820a6-6a95-4ab7-a9d8-6649fe45464a-operator-scripts\") pod \"keystone-db-create-l6vl8\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.855189 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-a222-account-create-update-qr82t"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.856648 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.864498 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.869985 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a222-account-create-update-qr82t"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.908606 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5387-account-create-update-wm5k5"] Mar 18 09:22:03 crc kubenswrapper[4778]: W0318 09:22:03.918360 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod445fcacb_d2c9_4892_89b5_4b2b6e54ebc9.slice/crio-3f56c6b693b7b9b071967115241709056ce764bedf268186ba682ea2bb87e798 WatchSource:0}: Error finding container 3f56c6b693b7b9b071967115241709056ce764bedf268186ba682ea2bb87e798: Status 404 returned error can't find the container with id 3f56c6b693b7b9b071967115241709056ce764bedf268186ba682ea2bb87e798 Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.954886 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a820a6-6a95-4ab7-a9d8-6649fe45464a-operator-scripts\") pod \"keystone-db-create-l6vl8\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.954966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfgmc\" (UniqueName: \"kubernetes.io/projected/51a820a6-6a95-4ab7-a9d8-6649fe45464a-kube-api-access-jfgmc\") pod \"keystone-db-create-l6vl8\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.956495 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a820a6-6a95-4ab7-a9d8-6649fe45464a-operator-scripts\") pod \"keystone-db-create-l6vl8\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.976703 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfgmc\" (UniqueName: \"kubernetes.io/projected/51a820a6-6a95-4ab7-a9d8-6649fe45464a-kube-api-access-jfgmc\") pod \"keystone-db-create-l6vl8\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.050024 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kj8ww"] Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.051516 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.056001 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg6vw\" (UniqueName: \"kubernetes.io/projected/c7ae14ca-efde-42ba-8edf-7cc34dc31036-kube-api-access-fg6vw\") pod \"keystone-a222-account-create-update-qr82t\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.056060 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ae14ca-efde-42ba-8edf-7cc34dc31036-operator-scripts\") pod \"keystone-a222-account-create-update-qr82t\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.063896 4778 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0a46-account-create-update-phb5p"] Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.066333 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.075506 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kj8ww"] Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.085265 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.086111 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.086448 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0a46-account-create-update-phb5p"] Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.161443 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p78gv\" (UniqueName: \"kubernetes.io/projected/24412394-390b-461c-9d18-617eba706adc-kube-api-access-p78gv\") pod \"placement-0a46-account-create-update-phb5p\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.161710 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be311af4-91f5-417e-971b-c9158576ca97-operator-scripts\") pod \"placement-db-create-kj8ww\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.161790 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-8bfb4\" (UniqueName: \"kubernetes.io/projected/be311af4-91f5-417e-971b-c9158576ca97-kube-api-access-8bfb4\") pod \"placement-db-create-kj8ww\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.162057 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg6vw\" (UniqueName: \"kubernetes.io/projected/c7ae14ca-efde-42ba-8edf-7cc34dc31036-kube-api-access-fg6vw\") pod \"keystone-a222-account-create-update-qr82t\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.162217 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ae14ca-efde-42ba-8edf-7cc34dc31036-operator-scripts\") pod \"keystone-a222-account-create-update-qr82t\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.162302 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24412394-390b-461c-9d18-617eba706adc-operator-scripts\") pod \"placement-0a46-account-create-update-phb5p\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.163407 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ae14ca-efde-42ba-8edf-7cc34dc31036-operator-scripts\") pod \"keystone-a222-account-create-update-qr82t\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: 
I0318 09:22:04.188793 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg6vw\" (UniqueName: \"kubernetes.io/projected/c7ae14ca-efde-42ba-8edf-7cc34dc31036-kube-api-access-fg6vw\") pod \"keystone-a222-account-create-update-qr82t\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.198829 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.266118 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be311af4-91f5-417e-971b-c9158576ca97-operator-scripts\") pod \"placement-db-create-kj8ww\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.268063 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bfb4\" (UniqueName: \"kubernetes.io/projected/be311af4-91f5-417e-971b-c9158576ca97-kube-api-access-8bfb4\") pod \"placement-db-create-kj8ww\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.268313 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24412394-390b-461c-9d18-617eba706adc-operator-scripts\") pod \"placement-0a46-account-create-update-phb5p\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.268610 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p78gv\" (UniqueName: 
\"kubernetes.io/projected/24412394-390b-461c-9d18-617eba706adc-kube-api-access-p78gv\") pod \"placement-0a46-account-create-update-phb5p\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.269609 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be311af4-91f5-417e-971b-c9158576ca97-operator-scripts\") pod \"placement-db-create-kj8ww\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.270220 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24412394-390b-461c-9d18-617eba706adc-operator-scripts\") pod \"placement-0a46-account-create-update-phb5p\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.304248 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bfb4\" (UniqueName: \"kubernetes.io/projected/be311af4-91f5-417e-971b-c9158576ca97-kube-api-access-8bfb4\") pod \"placement-db-create-kj8ww\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.304996 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p78gv\" (UniqueName: \"kubernetes.io/projected/24412394-390b-461c-9d18-617eba706adc-kube-api-access-p78gv\") pod \"placement-0a46-account-create-update-phb5p\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.391912 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.416099 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.437390 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.496780 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xvr5c"] Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.497641 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" podUID="124c0069-debd-459c-9d66-f38d9d096996" containerName="dnsmasq-dns" containerID="cri-o://1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e" gracePeriod=10 Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.520235 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563762-nk868" event={"ID":"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d","Type":"ContainerStarted","Data":"6d2d7a8a11e12ed4124ccde5834d93c0bc78bbc0a9c88b791847bb709dbbb116"} Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.531160 4778 generic.go:334] "Generic (PLEG): container finished" podID="f9222d9a-6507-4c32-9234-2c1c2b27a11e" containerID="c677c641e634b2b60d1bae546bbf8e8d9cc5553eb522dd4d67a26e72bf3f0752" exitCode=0 Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.531281 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dz4jc" event={"ID":"f9222d9a-6507-4c32-9234-2c1c2b27a11e","Type":"ContainerDied","Data":"c677c641e634b2b60d1bae546bbf8e8d9cc5553eb522dd4d67a26e72bf3f0752"} Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.531321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-dz4jc" event={"ID":"f9222d9a-6507-4c32-9234-2c1c2b27a11e","Type":"ContainerStarted","Data":"24b8cc6fbc48de499f68ecf860e8b6053ccab335ce1cff5b8a45ee911c1d9a90"} Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.562298 4778 generic.go:334] "Generic (PLEG): container finished" podID="ae95589d-d3fb-4254-9fd0-f59203e0e927" containerID="1fd591e2b660a21c69fd30f836286a21f17c533f4da08c01dd7acbea44c0d5f9" exitCode=0 Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.562924 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nnzs2" event={"ID":"ae95589d-d3fb-4254-9fd0-f59203e0e927","Type":"ContainerDied","Data":"1fd591e2b660a21c69fd30f836286a21f17c533f4da08c01dd7acbea44c0d5f9"} Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.577980 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5387-account-create-update-wm5k5" event={"ID":"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9","Type":"ContainerStarted","Data":"397a643cfe9ac0a1d5786dfe10182c0fd656474f01a6f126523a52404ea544a7"} Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.578129 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5387-account-create-update-wm5k5" event={"ID":"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9","Type":"ContainerStarted","Data":"3f56c6b693b7b9b071967115241709056ce764bedf268186ba682ea2bb87e798"} Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.663493 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-l6vl8"] Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.057879 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a222-account-create-update-qr82t"] Mar 18 09:22:05 crc kubenswrapper[4778]: W0318 09:22:05.087190 4778 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7ae14ca_efde_42ba_8edf_7cc34dc31036.slice/crio-d1026be7b9a9e421d1dd3fdd40e0dad9e9c0eb2a83f1b8aab937d30a4670c112 WatchSource:0}: Error finding container d1026be7b9a9e421d1dd3fdd40e0dad9e9c0eb2a83f1b8aab937d30a4670c112: Status 404 returned error can't find the container with id d1026be7b9a9e421d1dd3fdd40e0dad9e9c0eb2a83f1b8aab937d30a4670c112 Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.214947 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.231019 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kj8ww"] Mar 18 09:22:05 crc kubenswrapper[4778]: W0318 09:22:05.292988 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24412394_390b_461c_9d18_617eba706adc.slice/crio-a8cdfda576bdb60315d008b13d4ba643b525cf074d3f540a2ce9321334d9fb29 WatchSource:0}: Error finding container a8cdfda576bdb60315d008b13d4ba643b525cf074d3f540a2ce9321334d9fb29: Status 404 returned error can't find the container with id a8cdfda576bdb60315d008b13d4ba643b525cf074d3f540a2ce9321334d9fb29 Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.295002 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0a46-account-create-update-phb5p"] Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.316880 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chwdv\" (UniqueName: \"kubernetes.io/projected/124c0069-debd-459c-9d66-f38d9d096996-kube-api-access-chwdv\") pod \"124c0069-debd-459c-9d66-f38d9d096996\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.316976 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-config\") pod \"124c0069-debd-459c-9d66-f38d9d096996\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.317073 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-dns-svc\") pod \"124c0069-debd-459c-9d66-f38d9d096996\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.334745 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124c0069-debd-459c-9d66-f38d9d096996-kube-api-access-chwdv" (OuterVolumeSpecName: "kube-api-access-chwdv") pod "124c0069-debd-459c-9d66-f38d9d096996" (UID: "124c0069-debd-459c-9d66-f38d9d096996"). InnerVolumeSpecName "kube-api-access-chwdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.367504 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-config" (OuterVolumeSpecName: "config") pod "124c0069-debd-459c-9d66-f38d9d096996" (UID: "124c0069-debd-459c-9d66-f38d9d096996"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.368757 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "124c0069-debd-459c-9d66-f38d9d096996" (UID: "124c0069-debd-459c-9d66-f38d9d096996"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.418985 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chwdv\" (UniqueName: \"kubernetes.io/projected/124c0069-debd-459c-9d66-f38d9d096996-kube-api-access-chwdv\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.419038 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.419051 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.589228 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l6vl8" event={"ID":"51a820a6-6a95-4ab7-a9d8-6649fe45464a","Type":"ContainerDied","Data":"2d1134d737bb1ad4d1096a8735192a53e75e70709be8071f894f8def68f8db65"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.589052 4778 generic.go:334] "Generic (PLEG): container finished" podID="51a820a6-6a95-4ab7-a9d8-6649fe45464a" containerID="2d1134d737bb1ad4d1096a8735192a53e75e70709be8071f894f8def68f8db65" exitCode=0 Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.590350 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l6vl8" event={"ID":"51a820a6-6a95-4ab7-a9d8-6649fe45464a","Type":"ContainerStarted","Data":"c97c18a12d3c090de835642354d9ea6188d81c28ca151b3eb5f41951eda7e87c"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.594272 4778 generic.go:334] "Generic (PLEG): container finished" podID="7bcf145e-ae6a-4674-9f79-b6486ec2fa9d" containerID="6d2d7a8a11e12ed4124ccde5834d93c0bc78bbc0a9c88b791847bb709dbbb116" exitCode=0 Mar 18 09:22:05 crc 
kubenswrapper[4778]: I0318 09:22:05.594711 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563762-nk868" event={"ID":"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d","Type":"ContainerDied","Data":"6d2d7a8a11e12ed4124ccde5834d93c0bc78bbc0a9c88b791847bb709dbbb116"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.597072 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0a46-account-create-update-phb5p" event={"ID":"24412394-390b-461c-9d18-617eba706adc","Type":"ContainerStarted","Data":"a8cdfda576bdb60315d008b13d4ba643b525cf074d3f540a2ce9321334d9fb29"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.604940 4778 generic.go:334] "Generic (PLEG): container finished" podID="124c0069-debd-459c-9d66-f38d9d096996" containerID="1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e" exitCode=0 Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.605033 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" event={"ID":"124c0069-debd-459c-9d66-f38d9d096996","Type":"ContainerDied","Data":"1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.605071 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" event={"ID":"124c0069-debd-459c-9d66-f38d9d096996","Type":"ContainerDied","Data":"81379cb4a6ab1c31c5025b2cadc823fb3d4cb38e6f779d5cd8007da3c790b69a"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.605106 4778 scope.go:117] "RemoveContainer" containerID="1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.605793 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.616478 4778 generic.go:334] "Generic (PLEG): container finished" podID="c7ae14ca-efde-42ba-8edf-7cc34dc31036" containerID="70fd7ebf08e80da75227c830c21c112cbd85c345b44bad8cd8c81cd3f4b7fd7e" exitCode=0 Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.616555 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a222-account-create-update-qr82t" event={"ID":"c7ae14ca-efde-42ba-8edf-7cc34dc31036","Type":"ContainerDied","Data":"70fd7ebf08e80da75227c830c21c112cbd85c345b44bad8cd8c81cd3f4b7fd7e"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.616590 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a222-account-create-update-qr82t" event={"ID":"c7ae14ca-efde-42ba-8edf-7cc34dc31036","Type":"ContainerStarted","Data":"d1026be7b9a9e421d1dd3fdd40e0dad9e9c0eb2a83f1b8aab937d30a4670c112"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.618703 4778 generic.go:334] "Generic (PLEG): container finished" podID="445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" containerID="397a643cfe9ac0a1d5786dfe10182c0fd656474f01a6f126523a52404ea544a7" exitCode=0 Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.618801 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5387-account-create-update-wm5k5" event={"ID":"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9","Type":"ContainerDied","Data":"397a643cfe9ac0a1d5786dfe10182c0fd656474f01a6f126523a52404ea544a7"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.623359 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kj8ww" event={"ID":"be311af4-91f5-417e-971b-c9158576ca97","Type":"ContainerStarted","Data":"9b6fa295a9bfec83f890b1dd7210afd8d93f1d1f4da240bfbc29bf8af750edd0"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.623408 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-create-kj8ww" event={"ID":"be311af4-91f5-417e-971b-c9158576ca97","Type":"ContainerStarted","Data":"ea80e6972d772d1aea12da8304c45c18840c5828e5180b99ae338ff0921eae08"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.651783 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-0a46-account-create-update-phb5p" podStartSLOduration=1.651752208 podStartE2EDuration="1.651752208s" podCreationTimestamp="2026-03-18 09:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:05.629135051 +0000 UTC m=+1192.203879911" watchObservedRunningTime="2026-03-18 09:22:05.651752208 +0000 UTC m=+1192.226497048" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.653819 4778 scope.go:117] "RemoveContainer" containerID="5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.675419 4778 scope.go:117] "RemoveContainer" containerID="1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e" Mar 18 09:22:05 crc kubenswrapper[4778]: E0318 09:22:05.679318 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e\": container with ID starting with 1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e not found: ID does not exist" containerID="1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.679358 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e"} err="failed to get container status \"1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e\": rpc error: code = NotFound desc = could not find container 
\"1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e\": container with ID starting with 1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e not found: ID does not exist" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.679382 4778 scope.go:117] "RemoveContainer" containerID="5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.686463 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xvr5c"] Mar 18 09:22:05 crc kubenswrapper[4778]: E0318 09:22:05.695808 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c\": container with ID starting with 5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c not found: ID does not exist" containerID="5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.695860 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c"} err="failed to get container status \"5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c\": rpc error: code = NotFound desc = could not find container \"5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c\": container with ID starting with 5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c not found: ID does not exist" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.708127 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xvr5c"] Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.712399 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-kj8ww" podStartSLOduration=1.712320639 
podStartE2EDuration="1.712320639s" podCreationTimestamp="2026-03-18 09:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:05.675952887 +0000 UTC m=+1192.250697737" watchObservedRunningTime="2026-03-18 09:22:05.712320639 +0000 UTC m=+1192.287065479" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.948029 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.027784 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhg95\" (UniqueName: \"kubernetes.io/projected/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d-kube-api-access-dhg95\") pod \"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d\" (UID: \"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d\") " Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.033925 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d-kube-api-access-dhg95" (OuterVolumeSpecName: "kube-api-access-dhg95") pod "7bcf145e-ae6a-4674-9f79-b6486ec2fa9d" (UID: "7bcf145e-ae6a-4674-9f79-b6486ec2fa9d"). InnerVolumeSpecName "kube-api-access-dhg95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.129325 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhg95\" (UniqueName: \"kubernetes.io/projected/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d-kube-api-access-dhg95\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.154884 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.165805 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.181460 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.204172 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124c0069-debd-459c-9d66-f38d9d096996" path="/var/lib/kubelet/pods/124c0069-debd-459c-9d66-f38d9d096996/volumes" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.232615 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc2sm\" (UniqueName: \"kubernetes.io/projected/f9222d9a-6507-4c32-9234-2c1c2b27a11e-kube-api-access-bc2sm\") pod \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.232756 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m6wt\" (UniqueName: \"kubernetes.io/projected/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-kube-api-access-8m6wt\") pod \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.232833 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae95589d-d3fb-4254-9fd0-f59203e0e927-operator-scripts\") pod \"ae95589d-d3fb-4254-9fd0-f59203e0e927\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.232874 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tmwp\" (UniqueName: \"kubernetes.io/projected/ae95589d-d3fb-4254-9fd0-f59203e0e927-kube-api-access-8tmwp\") pod \"ae95589d-d3fb-4254-9fd0-f59203e0e927\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " Mar 18 09:22:06 crc 
kubenswrapper[4778]: I0318 09:22:06.232929 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-operator-scripts\") pod \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.232954 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9222d9a-6507-4c32-9234-2c1c2b27a11e-operator-scripts\") pod \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.233870 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9222d9a-6507-4c32-9234-2c1c2b27a11e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9222d9a-6507-4c32-9234-2c1c2b27a11e" (UID: "f9222d9a-6507-4c32-9234-2c1c2b27a11e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.234258 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" (UID: "445fcacb-d2c9-4892-89b5-4b2b6e54ebc9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.234652 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae95589d-d3fb-4254-9fd0-f59203e0e927-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae95589d-d3fb-4254-9fd0-f59203e0e927" (UID: "ae95589d-d3fb-4254-9fd0-f59203e0e927"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.236752 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-kube-api-access-8m6wt" (OuterVolumeSpecName: "kube-api-access-8m6wt") pod "445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" (UID: "445fcacb-d2c9-4892-89b5-4b2b6e54ebc9"). InnerVolumeSpecName "kube-api-access-8m6wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.237406 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9222d9a-6507-4c32-9234-2c1c2b27a11e-kube-api-access-bc2sm" (OuterVolumeSpecName: "kube-api-access-bc2sm") pod "f9222d9a-6507-4c32-9234-2c1c2b27a11e" (UID: "f9222d9a-6507-4c32-9234-2c1c2b27a11e"). InnerVolumeSpecName "kube-api-access-bc2sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.237693 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae95589d-d3fb-4254-9fd0-f59203e0e927-kube-api-access-8tmwp" (OuterVolumeSpecName: "kube-api-access-8tmwp") pod "ae95589d-d3fb-4254-9fd0-f59203e0e927" (UID: "ae95589d-d3fb-4254-9fd0-f59203e0e927"). InnerVolumeSpecName "kube-api-access-8tmwp". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.335342 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.335386 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9222d9a-6507-4c32-9234-2c1c2b27a11e-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.335399 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc2sm\" (UniqueName: \"kubernetes.io/projected/f9222d9a-6507-4c32-9234-2c1c2b27a11e-kube-api-access-bc2sm\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.335412 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m6wt\" (UniqueName: \"kubernetes.io/projected/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-kube-api-access-8m6wt\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.335427 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae95589d-d3fb-4254-9fd0-f59203e0e927-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.335438 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tmwp\" (UniqueName: \"kubernetes.io/projected/ae95589d-d3fb-4254-9fd0-f59203e0e927-kube-api-access-8tmwp\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.636577 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563762-nk868" event={"ID":"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d","Type":"ContainerDied","Data":"0d4f08f11dfe1809584f3cab0e3d071c5c19f66ebf3a573a7b3edd6c9c3b4163"}
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.637105 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d4f08f11dfe1809584f3cab0e3d071c5c19f66ebf3a573a7b3edd6c9c3b4163"
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.637249 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563762-nk868"
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.641014 4778 generic.go:334] "Generic (PLEG): container finished" podID="24412394-390b-461c-9d18-617eba706adc" containerID="39a152a6bb8ed07675c14ece0ad21851da7a9e9a103afe2312ea0254cd99b29c" exitCode=0
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.641276 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0a46-account-create-update-phb5p" event={"ID":"24412394-390b-461c-9d18-617eba706adc","Type":"ContainerDied","Data":"39a152a6bb8ed07675c14ece0ad21851da7a9e9a103afe2312ea0254cd99b29c"}
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.645569 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dz4jc" event={"ID":"f9222d9a-6507-4c32-9234-2c1c2b27a11e","Type":"ContainerDied","Data":"24b8cc6fbc48de499f68ecf860e8b6053ccab335ce1cff5b8a45ee911c1d9a90"}
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.645597 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b8cc6fbc48de499f68ecf860e8b6053ccab335ce1cff5b8a45ee911c1d9a90"
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.645642 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dz4jc"
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.647390 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nnzs2" event={"ID":"ae95589d-d3fb-4254-9fd0-f59203e0e927","Type":"ContainerDied","Data":"a2153b59b29a56867da95a8bfa79fff53d74a9d7fba4ed7aa6073d0c04ef364a"}
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.647415 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2153b59b29a56867da95a8bfa79fff53d74a9d7fba4ed7aa6073d0c04ef364a"
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.647478 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nnzs2"
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.663570 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5387-account-create-update-wm5k5"
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.664032 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5387-account-create-update-wm5k5" event={"ID":"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9","Type":"ContainerDied","Data":"3f56c6b693b7b9b071967115241709056ce764bedf268186ba682ea2bb87e798"}
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.664073 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f56c6b693b7b9b071967115241709056ce764bedf268186ba682ea2bb87e798"
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.666328 4778 generic.go:334] "Generic (PLEG): container finished" podID="be311af4-91f5-417e-971b-c9158576ca97" containerID="9b6fa295a9bfec83f890b1dd7210afd8d93f1d1f4da240bfbc29bf8af750edd0" exitCode=0
Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.666383 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kj8ww" event={"ID":"be311af4-91f5-417e-971b-c9158576ca97","Type":"ContainerDied","Data":"9b6fa295a9bfec83f890b1dd7210afd8d93f1d1f4da240bfbc29bf8af750edd0"}
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.024723 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a222-account-create-update-qr82t"
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.032610 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563756-s8bkt"]
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.039491 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563756-s8bkt"]
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.049426 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg6vw\" (UniqueName: \"kubernetes.io/projected/c7ae14ca-efde-42ba-8edf-7cc34dc31036-kube-api-access-fg6vw\") pod \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") "
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.049745 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ae14ca-efde-42ba-8edf-7cc34dc31036-operator-scripts\") pod \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") "
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.051249 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ae14ca-efde-42ba-8edf-7cc34dc31036-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7ae14ca-efde-42ba-8edf-7cc34dc31036" (UID: "c7ae14ca-efde-42ba-8edf-7cc34dc31036"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.075032 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ae14ca-efde-42ba-8edf-7cc34dc31036-kube-api-access-fg6vw" (OuterVolumeSpecName: "kube-api-access-fg6vw") pod "c7ae14ca-efde-42ba-8edf-7cc34dc31036" (UID: "c7ae14ca-efde-42ba-8edf-7cc34dc31036"). InnerVolumeSpecName "kube-api-access-fg6vw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.128964 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l6vl8"
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.151724 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a820a6-6a95-4ab7-a9d8-6649fe45464a-operator-scripts\") pod \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") "
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.151875 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfgmc\" (UniqueName: \"kubernetes.io/projected/51a820a6-6a95-4ab7-a9d8-6649fe45464a-kube-api-access-jfgmc\") pod \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") "
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.152230 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ae14ca-efde-42ba-8edf-7cc34dc31036-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.152242 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg6vw\" (UniqueName: \"kubernetes.io/projected/c7ae14ca-efde-42ba-8edf-7cc34dc31036-kube-api-access-fg6vw\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.152660 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a820a6-6a95-4ab7-a9d8-6649fe45464a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51a820a6-6a95-4ab7-a9d8-6649fe45464a" (UID: "51a820a6-6a95-4ab7-a9d8-6649fe45464a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.156528 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a820a6-6a95-4ab7-a9d8-6649fe45464a-kube-api-access-jfgmc" (OuterVolumeSpecName: "kube-api-access-jfgmc") pod "51a820a6-6a95-4ab7-a9d8-6649fe45464a" (UID: "51a820a6-6a95-4ab7-a9d8-6649fe45464a"). InnerVolumeSpecName "kube-api-access-jfgmc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.254160 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfgmc\" (UniqueName: \"kubernetes.io/projected/51a820a6-6a95-4ab7-a9d8-6649fe45464a-kube-api-access-jfgmc\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.254227 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a820a6-6a95-4ab7-a9d8-6649fe45464a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.679267 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l6vl8" event={"ID":"51a820a6-6a95-4ab7-a9d8-6649fe45464a","Type":"ContainerDied","Data":"c97c18a12d3c090de835642354d9ea6188d81c28ca151b3eb5f41951eda7e87c"}
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.679602 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c97c18a12d3c090de835642354d9ea6188d81c28ca151b3eb5f41951eda7e87c"
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.679679 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l6vl8"
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.682313 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a222-account-create-update-qr82t" event={"ID":"c7ae14ca-efde-42ba-8edf-7cc34dc31036","Type":"ContainerDied","Data":"d1026be7b9a9e421d1dd3fdd40e0dad9e9c0eb2a83f1b8aab937d30a4670c112"}
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.682382 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1026be7b9a9e421d1dd3fdd40e0dad9e9c0eb2a83f1b8aab937d30a4670c112"
Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.682471 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a222-account-create-update-qr82t"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.014285 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0a46-account-create-update-phb5p"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.067059 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p78gv\" (UniqueName: \"kubernetes.io/projected/24412394-390b-461c-9d18-617eba706adc-kube-api-access-p78gv\") pod \"24412394-390b-461c-9d18-617eba706adc\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") "
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.067438 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24412394-390b-461c-9d18-617eba706adc-operator-scripts\") pod \"24412394-390b-461c-9d18-617eba706adc\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") "
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.067874 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24412394-390b-461c-9d18-617eba706adc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24412394-390b-461c-9d18-617eba706adc" (UID: "24412394-390b-461c-9d18-617eba706adc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.085374 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24412394-390b-461c-9d18-617eba706adc-kube-api-access-p78gv" (OuterVolumeSpecName: "kube-api-access-p78gv") pod "24412394-390b-461c-9d18-617eba706adc" (UID: "24412394-390b-461c-9d18-617eba706adc"). InnerVolumeSpecName "kube-api-access-p78gv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.169500 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24412394-390b-461c-9d18-617eba706adc-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.169528 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p78gv\" (UniqueName: \"kubernetes.io/projected/24412394-390b-461c-9d18-617eba706adc-kube-api-access-p78gv\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.196488 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" path="/var/lib/kubelet/pods/a6298370-ed2e-4705-827b-c1a77b03f32a/volumes"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.430026 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kj8ww"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.534623 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-b66ph"]
Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535024 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a820a6-6a95-4ab7-a9d8-6649fe45464a" containerName="mariadb-database-create"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535044 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a820a6-6a95-4ab7-a9d8-6649fe45464a" containerName="mariadb-database-create"
Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535078 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be311af4-91f5-417e-971b-c9158576ca97" containerName="mariadb-database-create"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535087 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be311af4-91f5-417e-971b-c9158576ca97" containerName="mariadb-database-create"
Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535097 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9222d9a-6507-4c32-9234-2c1c2b27a11e" containerName="mariadb-database-create"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535105 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9222d9a-6507-4c32-9234-2c1c2b27a11e" containerName="mariadb-database-create"
Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535123 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae95589d-d3fb-4254-9fd0-f59203e0e927" containerName="mariadb-account-create-update"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535131 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae95589d-d3fb-4254-9fd0-f59203e0e927" containerName="mariadb-account-create-update"
Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535147 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ae14ca-efde-42ba-8edf-7cc34dc31036" containerName="mariadb-account-create-update"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535155 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ae14ca-efde-42ba-8edf-7cc34dc31036" containerName="mariadb-account-create-update"
Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535178 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124c0069-debd-459c-9d66-f38d9d096996" containerName="init"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535185 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="124c0069-debd-459c-9d66-f38d9d096996" containerName="init"
Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535215 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bcf145e-ae6a-4674-9f79-b6486ec2fa9d" containerName="oc"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535223 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bcf145e-ae6a-4674-9f79-b6486ec2fa9d" containerName="oc"
Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535239 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" containerName="mariadb-account-create-update"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535247 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" containerName="mariadb-account-create-update"
Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535261 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24412394-390b-461c-9d18-617eba706adc" containerName="mariadb-account-create-update"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535269 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="24412394-390b-461c-9d18-617eba706adc" containerName="mariadb-account-create-update"
Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535279 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124c0069-debd-459c-9d66-f38d9d096996" containerName="dnsmasq-dns"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535287 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="124c0069-debd-459c-9d66-f38d9d096996" containerName="dnsmasq-dns"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535462 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" containerName="mariadb-account-create-update"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535479 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9222d9a-6507-4c32-9234-2c1c2b27a11e" containerName="mariadb-database-create"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535492 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a820a6-6a95-4ab7-a9d8-6649fe45464a" containerName="mariadb-database-create"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535502 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="24412394-390b-461c-9d18-617eba706adc" containerName="mariadb-account-create-update"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535515 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ae14ca-efde-42ba-8edf-7cc34dc31036" containerName="mariadb-account-create-update"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535524 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae95589d-d3fb-4254-9fd0-f59203e0e927" containerName="mariadb-account-create-update"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535532 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="be311af4-91f5-417e-971b-c9158576ca97" containerName="mariadb-database-create"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535542 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="124c0069-debd-459c-9d66-f38d9d096996" containerName="dnsmasq-dns"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535553 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bcf145e-ae6a-4674-9f79-b6486ec2fa9d" containerName="oc"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.536158 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.541551 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.541666 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ntc8r"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.564640 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b66ph"]
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.591981 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be311af4-91f5-417e-971b-c9158576ca97-operator-scripts\") pod \"be311af4-91f5-417e-971b-c9158576ca97\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") "
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.592169 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bfb4\" (UniqueName: \"kubernetes.io/projected/be311af4-91f5-417e-971b-c9158576ca97-kube-api-access-8bfb4\") pod \"be311af4-91f5-417e-971b-c9158576ca97\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") "
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.601760 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be311af4-91f5-417e-971b-c9158576ca97-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be311af4-91f5-417e-971b-c9158576ca97" (UID: "be311af4-91f5-417e-971b-c9158576ca97"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.610491 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be311af4-91f5-417e-971b-c9158576ca97-kube-api-access-8bfb4" (OuterVolumeSpecName: "kube-api-access-8bfb4") pod "be311af4-91f5-417e-971b-c9158576ca97" (UID: "be311af4-91f5-417e-971b-c9158576ca97"). InnerVolumeSpecName "kube-api-access-8bfb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.692483 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0a46-account-create-update-phb5p" event={"ID":"24412394-390b-461c-9d18-617eba706adc","Type":"ContainerDied","Data":"a8cdfda576bdb60315d008b13d4ba643b525cf074d3f540a2ce9321334d9fb29"}
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.692509 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0a46-account-create-update-phb5p"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.692525 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8cdfda576bdb60315d008b13d4ba643b525cf074d3f540a2ce9321334d9fb29"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694094 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-combined-ca-bundle\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694153 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kj8ww" event={"ID":"be311af4-91f5-417e-971b-c9158576ca97","Type":"ContainerDied","Data":"ea80e6972d772d1aea12da8304c45c18840c5828e5180b99ae338ff0921eae08"}
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694174 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c69rn\" (UniqueName: \"kubernetes.io/projected/5dadb643-21f7-497a-992f-41ab80c704c5-kube-api-access-c69rn\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694235 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-db-sync-config-data\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694255 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-config-data\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694265 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kj8ww"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694316 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be311af4-91f5-417e-971b-c9158576ca97-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694331 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bfb4\" (UniqueName: \"kubernetes.io/projected/be311af4-91f5-417e-971b-c9158576ca97-kube-api-access-8bfb4\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694174 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea80e6972d772d1aea12da8304c45c18840c5828e5180b99ae338ff0921eae08"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.795821 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-db-sync-config-data\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.796164 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-config-data\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.796273 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-combined-ca-bundle\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.796318 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c69rn\" (UniqueName: \"kubernetes.io/projected/5dadb643-21f7-497a-992f-41ab80c704c5-kube-api-access-c69rn\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.800655 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-db-sync-config-data\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.800708 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-config-data\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.803267 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-combined-ca-bundle\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.817447 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c69rn\" (UniqueName: \"kubernetes.io/projected/5dadb643-21f7-497a-992f-41ab80c704c5-kube-api-access-c69rn\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.875838 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b66ph"
Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.517780 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b66ph"]
Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.536904 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.700225 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-nnzs2"]
Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.702263 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b66ph" event={"ID":"5dadb643-21f7-497a-992f-41ab80c704c5","Type":"ContainerStarted","Data":"2d28880ba4925c32ee75f685408b2e6019fc2833229df2500837da232ac04367"}
Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.709387 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-nnzs2"]
Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.797134 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bmx7j"]
Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.798348 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.807065 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.807515 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bmx7j"]
Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.942981 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c364f41-96b2-472b-bf96-fbbe1c1c5515-operator-scripts\") pod \"root-account-create-update-bmx7j\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") " pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.943063 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-569bc\" (UniqueName: \"kubernetes.io/projected/3c364f41-96b2-472b-bf96-fbbe1c1c5515-kube-api-access-569bc\") pod \"root-account-create-update-bmx7j\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") " pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.044136 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c364f41-96b2-472b-bf96-fbbe1c1c5515-operator-scripts\") pod \"root-account-create-update-bmx7j\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") " pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.044245 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-569bc\" (UniqueName: \"kubernetes.io/projected/3c364f41-96b2-472b-bf96-fbbe1c1c5515-kube-api-access-569bc\") pod \"root-account-create-update-bmx7j\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") " pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.045345 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c364f41-96b2-472b-bf96-fbbe1c1c5515-operator-scripts\") pod \"root-account-create-update-bmx7j\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") " pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.077139 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-569bc\" (UniqueName: \"kubernetes.io/projected/3c364f41-96b2-472b-bf96-fbbe1c1c5515-kube-api-access-569bc\") pod \"root-account-create-update-bmx7j\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") " pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.152386 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.204397 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae95589d-d3fb-4254-9fd0-f59203e0e927" path="/var/lib/kubelet/pods/ae95589d-d3fb-4254-9fd0-f59203e0e927/volumes"
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.601920 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bmx7j"]
Mar 18 09:22:10 crc kubenswrapper[4778]: W0318 09:22:10.607081 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c364f41_96b2_472b_bf96_fbbe1c1c5515.slice/crio-4c38f2389e7b143602b7ffeca1f041770927d65b1eec4410ead1aa8f68aa1bf5 WatchSource:0}: Error finding container 4c38f2389e7b143602b7ffeca1f041770927d65b1eec4410ead1aa8f68aa1bf5: Status 404 returned error can't find the container with id 4c38f2389e7b143602b7ffeca1f041770927d65b1eec4410ead1aa8f68aa1bf5
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.711634 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bmx7j" event={"ID":"3c364f41-96b2-472b-bf96-fbbe1c1c5515","Type":"ContainerStarted","Data":"4c38f2389e7b143602b7ffeca1f041770927d65b1eec4410ead1aa8f68aa1bf5"}
Mar 18 09:22:11 crc kubenswrapper[4778]: I0318 09:22:11.750609 4778 generic.go:334] "Generic (PLEG): container finished" podID="3c364f41-96b2-472b-bf96-fbbe1c1c5515" containerID="d4dc8e8b710d4699b5bf32fb7126e48abd2e3887409f25b1985152911c6485a4" exitCode=0
Mar 18 09:22:11 crc kubenswrapper[4778]: I0318 09:22:11.750707 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bmx7j" event={"ID":"3c364f41-96b2-472b-bf96-fbbe1c1c5515","Type":"ContainerDied","Data":"d4dc8e8b710d4699b5bf32fb7126e48abd2e3887409f25b1985152911c6485a4"}
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.151429 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.307737 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c364f41-96b2-472b-bf96-fbbe1c1c5515-operator-scripts\") pod \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") "
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.307877 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-569bc\" (UniqueName: \"kubernetes.io/projected/3c364f41-96b2-472b-bf96-fbbe1c1c5515-kube-api-access-569bc\") pod \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") "
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.309191 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c364f41-96b2-472b-bf96-fbbe1c1c5515-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c364f41-96b2-472b-bf96-fbbe1c1c5515" (UID: "3c364f41-96b2-472b-bf96-fbbe1c1c5515"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.330032 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c364f41-96b2-472b-bf96-fbbe1c1c5515-kube-api-access-569bc" (OuterVolumeSpecName: "kube-api-access-569bc") pod "3c364f41-96b2-472b-bf96-fbbe1c1c5515" (UID: "3c364f41-96b2-472b-bf96-fbbe1c1c5515"). InnerVolumeSpecName "kube-api-access-569bc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.409936 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c364f41-96b2-472b-bf96-fbbe1c1c5515-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.409970 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-569bc\" (UniqueName: \"kubernetes.io/projected/3c364f41-96b2-472b-bf96-fbbe1c1c5515-kube-api-access-569bc\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.780835 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bmx7j" event={"ID":"3c364f41-96b2-472b-bf96-fbbe1c1c5515","Type":"ContainerDied","Data":"4c38f2389e7b143602b7ffeca1f041770927d65b1eec4410ead1aa8f68aa1bf5"}
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.780895 4778 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/root-account-create-update-bmx7j" Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.780913 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c38f2389e7b143602b7ffeca1f041770927d65b1eec4410ead1aa8f68aa1bf5" Mar 18 09:22:14 crc kubenswrapper[4778]: I0318 09:22:14.577511 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 09:22:17 crc kubenswrapper[4778]: I0318 09:22:17.423824 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" podUID="208b26f2-3c91-4966-9d01-8fe73e4a7d87" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.88:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 09:22:17 crc kubenswrapper[4778]: I0318 09:22:17.534285 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bmx7j"] Mar 18 09:22:17 crc kubenswrapper[4778]: I0318 09:22:17.542003 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bmx7j"] Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.198700 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c364f41-96b2-472b-bf96-fbbe1c1c5515" path="/var/lib/kubelet/pods/3c364f41-96b2-472b-bf96-fbbe1c1c5515/volumes" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.376422 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-djmq6" podUID="f58533cf-4c57-4c3a-b772-e2a488298d7e" containerName="ovn-controller" probeResult="failure" output=< Mar 18 09:22:18 crc kubenswrapper[4778]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 09:22:18 crc kubenswrapper[4778]: > Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.415055 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.455051 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.779413 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-djmq6-config-6ccgx"] Mar 18 09:22:18 crc kubenswrapper[4778]: E0318 09:22:18.779784 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c364f41-96b2-472b-bf96-fbbe1c1c5515" containerName="mariadb-account-create-update" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.779800 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c364f41-96b2-472b-bf96-fbbe1c1c5515" containerName="mariadb-account-create-update" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.779962 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c364f41-96b2-472b-bf96-fbbe1c1c5515" containerName="mariadb-account-create-update" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.780505 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.782977 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.805220 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djmq6-config-6ccgx"] Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.906526 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-scripts\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.906629 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-additional-scripts\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.906681 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw4kw\" (UniqueName: \"kubernetes.io/projected/20e799eb-9c49-4025-98e1-b25be1bac66a-kube-api-access-rw4kw\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.906732 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: 
\"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.906804 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run-ovn\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.906866 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-log-ovn\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008473 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-additional-scripts\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008549 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw4kw\" (UniqueName: \"kubernetes.io/projected/20e799eb-9c49-4025-98e1-b25be1bac66a-kube-api-access-rw4kw\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008602 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run\") pod 
\"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008635 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run-ovn\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008690 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-log-ovn\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008783 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-scripts\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008937 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-log-ovn\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008960 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: 
\"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.009088 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run-ovn\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.009463 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-additional-scripts\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.012058 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-scripts\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.037334 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw4kw\" (UniqueName: \"kubernetes.io/projected/20e799eb-9c49-4025-98e1-b25be1bac66a-kube-api-access-rw4kw\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.125524 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:21 crc kubenswrapper[4778]: I0318 09:22:21.541333 4778 generic.go:334] "Generic (PLEG): container finished" podID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerID="3698e124eba53a15e3f16dfe6346805545443e4b8ce94d12254d508326508979" exitCode=0 Mar 18 09:22:21 crc kubenswrapper[4778]: I0318 09:22:21.541429 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57955df9-f0c5-4cfc-91fd-135771be7ed2","Type":"ContainerDied","Data":"3698e124eba53a15e3f16dfe6346805545443e4b8ce94d12254d508326508979"} Mar 18 09:22:21 crc kubenswrapper[4778]: I0318 09:22:21.544302 4778 generic.go:334] "Generic (PLEG): container finished" podID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerID="fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7" exitCode=0 Mar 18 09:22:21 crc kubenswrapper[4778]: I0318 09:22:21.544333 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"671fe1be-f3dd-475e-8c48-a1d1db510aef","Type":"ContainerDied","Data":"fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7"} Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.536498 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zrjls"] Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.537882 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zrjls" Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.539598 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.547135 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zrjls"] Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.575862 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7dnt\" (UniqueName: \"kubernetes.io/projected/8560ebac-334f-4332-b324-cdb297a94b1a-kube-api-access-v7dnt\") pod \"root-account-create-update-zrjls\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " pod="openstack/root-account-create-update-zrjls" Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.575993 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8560ebac-334f-4332-b324-cdb297a94b1a-operator-scripts\") pod \"root-account-create-update-zrjls\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " pod="openstack/root-account-create-update-zrjls" Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.677501 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7dnt\" (UniqueName: \"kubernetes.io/projected/8560ebac-334f-4332-b324-cdb297a94b1a-kube-api-access-v7dnt\") pod \"root-account-create-update-zrjls\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " pod="openstack/root-account-create-update-zrjls" Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.678017 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8560ebac-334f-4332-b324-cdb297a94b1a-operator-scripts\") pod \"root-account-create-update-zrjls\" (UID: 
\"8560ebac-334f-4332-b324-cdb297a94b1a\") " pod="openstack/root-account-create-update-zrjls" Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.678839 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8560ebac-334f-4332-b324-cdb297a94b1a-operator-scripts\") pod \"root-account-create-update-zrjls\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " pod="openstack/root-account-create-update-zrjls" Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.699453 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7dnt\" (UniqueName: \"kubernetes.io/projected/8560ebac-334f-4332-b324-cdb297a94b1a-kube-api-access-v7dnt\") pod \"root-account-create-update-zrjls\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " pod="openstack/root-account-create-update-zrjls" Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.856222 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zrjls" Mar 18 09:22:23 crc kubenswrapper[4778]: I0318 09:22:23.369805 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-djmq6" podUID="f58533cf-4c57-4c3a-b772-e2a488298d7e" containerName="ovn-controller" probeResult="failure" output=< Mar 18 09:22:23 crc kubenswrapper[4778]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 09:22:23 crc kubenswrapper[4778]: > Mar 18 09:22:25 crc kubenswrapper[4778]: I0318 09:22:25.738080 4778 scope.go:117] "RemoveContainer" containerID="4909b98cff116d6eb4c151d4ba3b46f1a567c070f760a907d7a4e8ea4dca9196" Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.180369 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djmq6-config-6ccgx"] Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.380895 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zrjls"] Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.600310 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zrjls" event={"ID":"8560ebac-334f-4332-b324-cdb297a94b1a","Type":"ContainerStarted","Data":"8485de1959de5e473a1a0282e19bec9c8061e5419357cbeb799b7cc895e3b146"} Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.600835 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zrjls" event={"ID":"8560ebac-334f-4332-b324-cdb297a94b1a","Type":"ContainerStarted","Data":"b8c940ffd2808689a9da3bc2d7f8c95ee43229d10a3c9d4eb5fdacb3256bde90"} Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.601979 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b66ph" event={"ID":"5dadb643-21f7-497a-992f-41ab80c704c5","Type":"ContainerStarted","Data":"0cbb671a57344b775f4ff6d3749d585e5e115edb6d8d987453203f44ff882ff0"} Mar 18 09:22:27 crc 
kubenswrapper[4778]: I0318 09:22:27.604893 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57955df9-f0c5-4cfc-91fd-135771be7ed2","Type":"ContainerStarted","Data":"29a1b3eb3844657a811126fd222acecd2b3045c5c60889965b169deb2aec2c9a"} Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.605330 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.607690 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"671fe1be-f3dd-475e-8c48-a1d1db510aef","Type":"ContainerStarted","Data":"4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48"} Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.608227 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.612626 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djmq6-config-6ccgx" event={"ID":"20e799eb-9c49-4025-98e1-b25be1bac66a","Type":"ContainerStarted","Data":"a91ae12327b49c9ef11425b6654b97316a81c22c90fc2a91ca50368382d18bdb"} Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.612661 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djmq6-config-6ccgx" event={"ID":"20e799eb-9c49-4025-98e1-b25be1bac66a","Type":"ContainerStarted","Data":"4282be7a65cf991e1b8ae29fb89dbb383c0cb63e6bfe1799f15c822d2653961d"} Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.628344 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-zrjls" podStartSLOduration=5.628323363 podStartE2EDuration="5.628323363s" podCreationTimestamp="2026-03-18 09:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 09:22:27.618573097 +0000 UTC m=+1214.193317947" watchObservedRunningTime="2026-03-18 09:22:27.628323363 +0000 UTC m=+1214.203068203" Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.650619 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-djmq6-config-6ccgx" podStartSLOduration=9.650598391 podStartE2EDuration="9.650598391s" podCreationTimestamp="2026-03-18 09:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:27.644307438 +0000 UTC m=+1214.219052288" watchObservedRunningTime="2026-03-18 09:22:27.650598391 +0000 UTC m=+1214.225343231" Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.674156 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=58.469613519 podStartE2EDuration="1m9.674129852s" podCreationTimestamp="2026-03-18 09:21:18 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.491991181 +0000 UTC m=+1162.066736021" lastFinishedPulling="2026-03-18 09:21:46.696507514 +0000 UTC m=+1173.271252354" observedRunningTime="2026-03-18 09:22:27.668674453 +0000 UTC m=+1214.243419293" watchObservedRunningTime="2026-03-18 09:22:27.674129852 +0000 UTC m=+1214.248874692" Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.705144 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=58.863623155 podStartE2EDuration="1m9.705118937s" podCreationTimestamp="2026-03-18 09:21:18 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.72219076 +0000 UTC m=+1162.296935600" lastFinishedPulling="2026-03-18 09:21:46.563686552 +0000 UTC m=+1173.138431382" observedRunningTime="2026-03-18 09:22:27.696737979 +0000 UTC m=+1214.271482829" watchObservedRunningTime="2026-03-18 09:22:27.705118937 +0000 UTC m=+1214.279863767" Mar 18 09:22:27 
crc kubenswrapper[4778]: I0318 09:22:27.735458 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-b66ph" podStartSLOduration=2.339762672 podStartE2EDuration="19.735433585s" podCreationTimestamp="2026-03-18 09:22:08 +0000 UTC" firstStartedPulling="2026-03-18 09:22:09.536543332 +0000 UTC m=+1196.111288162" lastFinishedPulling="2026-03-18 09:22:26.932214225 +0000 UTC m=+1213.506959075" observedRunningTime="2026-03-18 09:22:27.728281009 +0000 UTC m=+1214.303025859" watchObservedRunningTime="2026-03-18 09:22:27.735433585 +0000 UTC m=+1214.310178455" Mar 18 09:22:28 crc kubenswrapper[4778]: I0318 09:22:28.455656 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-djmq6" Mar 18 09:22:28 crc kubenswrapper[4778]: I0318 09:22:28.623750 4778 generic.go:334] "Generic (PLEG): container finished" podID="8560ebac-334f-4332-b324-cdb297a94b1a" containerID="8485de1959de5e473a1a0282e19bec9c8061e5419357cbeb799b7cc895e3b146" exitCode=0 Mar 18 09:22:28 crc kubenswrapper[4778]: I0318 09:22:28.623843 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zrjls" event={"ID":"8560ebac-334f-4332-b324-cdb297a94b1a","Type":"ContainerDied","Data":"8485de1959de5e473a1a0282e19bec9c8061e5419357cbeb799b7cc895e3b146"} Mar 18 09:22:28 crc kubenswrapper[4778]: I0318 09:22:28.626132 4778 generic.go:334] "Generic (PLEG): container finished" podID="20e799eb-9c49-4025-98e1-b25be1bac66a" containerID="a91ae12327b49c9ef11425b6654b97316a81c22c90fc2a91ca50368382d18bdb" exitCode=0 Mar 18 09:22:28 crc kubenswrapper[4778]: I0318 09:22:28.626229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djmq6-config-6ccgx" event={"ID":"20e799eb-9c49-4025-98e1-b25be1bac66a","Type":"ContainerDied","Data":"a91ae12327b49c9ef11425b6654b97316a81c22c90fc2a91ca50368382d18bdb"} Mar 18 09:22:30 crc kubenswrapper[4778]: I0318 09:22:30.147161 4778 
patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:22:30 crc kubenswrapper[4778]: I0318 09:22:30.147533 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.022579 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zrjls" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.030761 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135022 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw4kw\" (UniqueName: \"kubernetes.io/projected/20e799eb-9c49-4025-98e1-b25be1bac66a-kube-api-access-rw4kw\") pod \"20e799eb-9c49-4025-98e1-b25be1bac66a\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135094 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run-ovn\") pod \"20e799eb-9c49-4025-98e1-b25be1bac66a\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135121 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-log-ovn\") pod \"20e799eb-9c49-4025-98e1-b25be1bac66a\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135258 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7dnt\" (UniqueName: \"kubernetes.io/projected/8560ebac-334f-4332-b324-cdb297a94b1a-kube-api-access-v7dnt\") pod \"8560ebac-334f-4332-b324-cdb297a94b1a\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135332 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-scripts\") pod \"20e799eb-9c49-4025-98e1-b25be1bac66a\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135370 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run\") pod \"20e799eb-9c49-4025-98e1-b25be1bac66a\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135414 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8560ebac-334f-4332-b324-cdb297a94b1a-operator-scripts\") pod \"8560ebac-334f-4332-b324-cdb297a94b1a\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135447 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-additional-scripts\") pod \"20e799eb-9c49-4025-98e1-b25be1bac66a\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 
09:22:31.136709 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "20e799eb-9c49-4025-98e1-b25be1bac66a" (UID: "20e799eb-9c49-4025-98e1-b25be1bac66a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.138325 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "20e799eb-9c49-4025-98e1-b25be1bac66a" (UID: "20e799eb-9c49-4025-98e1-b25be1bac66a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.138363 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "20e799eb-9c49-4025-98e1-b25be1bac66a" (UID: "20e799eb-9c49-4025-98e1-b25be1bac66a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.138397 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run" (OuterVolumeSpecName: "var-run") pod "20e799eb-9c49-4025-98e1-b25be1bac66a" (UID: "20e799eb-9c49-4025-98e1-b25be1bac66a"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.138918 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8560ebac-334f-4332-b324-cdb297a94b1a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8560ebac-334f-4332-b324-cdb297a94b1a" (UID: "8560ebac-334f-4332-b324-cdb297a94b1a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.139328 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-scripts" (OuterVolumeSpecName: "scripts") pod "20e799eb-9c49-4025-98e1-b25be1bac66a" (UID: "20e799eb-9c49-4025-98e1-b25be1bac66a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.157334 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8560ebac-334f-4332-b324-cdb297a94b1a-kube-api-access-v7dnt" (OuterVolumeSpecName: "kube-api-access-v7dnt") pod "8560ebac-334f-4332-b324-cdb297a94b1a" (UID: "8560ebac-334f-4332-b324-cdb297a94b1a"). InnerVolumeSpecName "kube-api-access-v7dnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.170449 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e799eb-9c49-4025-98e1-b25be1bac66a-kube-api-access-rw4kw" (OuterVolumeSpecName: "kube-api-access-rw4kw") pod "20e799eb-9c49-4025-98e1-b25be1bac66a" (UID: "20e799eb-9c49-4025-98e1-b25be1bac66a"). InnerVolumeSpecName "kube-api-access-rw4kw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.237715 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw4kw\" (UniqueName: \"kubernetes.io/projected/20e799eb-9c49-4025-98e1-b25be1bac66a-kube-api-access-rw4kw\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238716 4778 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238770 4778 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238800 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7dnt\" (UniqueName: \"kubernetes.io/projected/8560ebac-334f-4332-b324-cdb297a94b1a-kube-api-access-v7dnt\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238823 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238841 4778 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238859 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8560ebac-334f-4332-b324-cdb297a94b1a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238880 4778 
reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.658675 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djmq6-config-6ccgx" event={"ID":"20e799eb-9c49-4025-98e1-b25be1bac66a","Type":"ContainerDied","Data":"4282be7a65cf991e1b8ae29fb89dbb383c0cb63e6bfe1799f15c822d2653961d"} Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.659157 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4282be7a65cf991e1b8ae29fb89dbb383c0cb63e6bfe1799f15c822d2653961d" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.659039 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djmq6-config-6ccgx" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.662514 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zrjls" event={"ID":"8560ebac-334f-4332-b324-cdb297a94b1a","Type":"ContainerDied","Data":"b8c940ffd2808689a9da3bc2d7f8c95ee43229d10a3c9d4eb5fdacb3256bde90"} Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.662611 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8c940ffd2808689a9da3bc2d7f8c95ee43229d10a3c9d4eb5fdacb3256bde90" Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.662802 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zrjls" Mar 18 09:22:32 crc kubenswrapper[4778]: I0318 09:22:32.156227 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-djmq6-config-6ccgx"] Mar 18 09:22:32 crc kubenswrapper[4778]: I0318 09:22:32.162053 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-djmq6-config-6ccgx"] Mar 18 09:22:32 crc kubenswrapper[4778]: I0318 09:22:32.196059 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20e799eb-9c49-4025-98e1-b25be1bac66a" path="/var/lib/kubelet/pods/20e799eb-9c49-4025-98e1-b25be1bac66a/volumes" Mar 18 09:22:35 crc kubenswrapper[4778]: I0318 09:22:35.715095 4778 generic.go:334] "Generic (PLEG): container finished" podID="5dadb643-21f7-497a-992f-41ab80c704c5" containerID="0cbb671a57344b775f4ff6d3749d585e5e115edb6d8d987453203f44ff882ff0" exitCode=0 Mar 18 09:22:35 crc kubenswrapper[4778]: I0318 09:22:35.715239 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b66ph" event={"ID":"5dadb643-21f7-497a-992f-41ab80c704c5","Type":"ContainerDied","Data":"0cbb671a57344b775f4ff6d3749d585e5e115edb6d8d987453203f44ff882ff0"} Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.121795 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.277963 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c69rn\" (UniqueName: \"kubernetes.io/projected/5dadb643-21f7-497a-992f-41ab80c704c5-kube-api-access-c69rn\") pod \"5dadb643-21f7-497a-992f-41ab80c704c5\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.278161 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-config-data\") pod \"5dadb643-21f7-497a-992f-41ab80c704c5\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.278259 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-db-sync-config-data\") pod \"5dadb643-21f7-497a-992f-41ab80c704c5\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.278283 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-combined-ca-bundle\") pod \"5dadb643-21f7-497a-992f-41ab80c704c5\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.285289 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dadb643-21f7-497a-992f-41ab80c704c5-kube-api-access-c69rn" (OuterVolumeSpecName: "kube-api-access-c69rn") pod "5dadb643-21f7-497a-992f-41ab80c704c5" (UID: "5dadb643-21f7-497a-992f-41ab80c704c5"). InnerVolumeSpecName "kube-api-access-c69rn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.286009 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5dadb643-21f7-497a-992f-41ab80c704c5" (UID: "5dadb643-21f7-497a-992f-41ab80c704c5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.302524 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dadb643-21f7-497a-992f-41ab80c704c5" (UID: "5dadb643-21f7-497a-992f-41ab80c704c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.331460 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-config-data" (OuterVolumeSpecName: "config-data") pod "5dadb643-21f7-497a-992f-41ab80c704c5" (UID: "5dadb643-21f7-497a-992f-41ab80c704c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.379863 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.379909 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.379925 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.379940 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c69rn\" (UniqueName: \"kubernetes.io/projected/5dadb643-21f7-497a-992f-41ab80c704c5-kube-api-access-c69rn\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.735546 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b66ph" event={"ID":"5dadb643-21f7-497a-992f-41ab80c704c5","Type":"ContainerDied","Data":"2d28880ba4925c32ee75f685408b2e6019fc2833229df2500837da232ac04367"} Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.735597 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d28880ba4925c32ee75f685408b2e6019fc2833229df2500837da232ac04367" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.735665 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.179607 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-npjfc"] Mar 18 09:22:38 crc kubenswrapper[4778]: E0318 09:22:38.180578 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dadb643-21f7-497a-992f-41ab80c704c5" containerName="glance-db-sync" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.180596 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dadb643-21f7-497a-992f-41ab80c704c5" containerName="glance-db-sync" Mar 18 09:22:38 crc kubenswrapper[4778]: E0318 09:22:38.180614 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e799eb-9c49-4025-98e1-b25be1bac66a" containerName="ovn-config" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.180622 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e799eb-9c49-4025-98e1-b25be1bac66a" containerName="ovn-config" Mar 18 09:22:38 crc kubenswrapper[4778]: E0318 09:22:38.180636 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8560ebac-334f-4332-b324-cdb297a94b1a" containerName="mariadb-account-create-update" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.180644 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8560ebac-334f-4332-b324-cdb297a94b1a" containerName="mariadb-account-create-update" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.180825 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dadb643-21f7-497a-992f-41ab80c704c5" containerName="glance-db-sync" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.180839 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e799eb-9c49-4025-98e1-b25be1bac66a" containerName="ovn-config" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.180847 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8560ebac-334f-4332-b324-cdb297a94b1a" containerName="mariadb-account-create-update" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.181873 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.256020 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-npjfc"] Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.298930 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-config\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.299000 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-dns-svc\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.299039 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.299998 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " 
pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.300074 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vb9\" (UniqueName: \"kubernetes.io/projected/4a490b75-6853-41f7-b5b3-46243c4c2166-kube-api-access-68vb9\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.402612 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.402753 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.402804 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vb9\" (UniqueName: \"kubernetes.io/projected/4a490b75-6853-41f7-b5b3-46243c4c2166-kube-api-access-68vb9\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.402895 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-config\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " 
pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.402935 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-dns-svc\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.403902 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.404104 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-dns-svc\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.404326 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-config\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.404799 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.425629 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vb9\" (UniqueName: \"kubernetes.io/projected/4a490b75-6853-41f7-b5b3-46243c4c2166-kube-api-access-68vb9\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.561112 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:39 crc kubenswrapper[4778]: I0318 09:22:39.011496 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-npjfc"] Mar 18 09:22:39 crc kubenswrapper[4778]: I0318 09:22:39.754444 4778 generic.go:334] "Generic (PLEG): container finished" podID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerID="434cdf2be80022d070ec54085d25f3459b4f9eebfed357f2ab22cbeac663278b" exitCode=0 Mar 18 09:22:39 crc kubenswrapper[4778]: I0318 09:22:39.754495 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" event={"ID":"4a490b75-6853-41f7-b5b3-46243c4c2166","Type":"ContainerDied","Data":"434cdf2be80022d070ec54085d25f3459b4f9eebfed357f2ab22cbeac663278b"} Mar 18 09:22:39 crc kubenswrapper[4778]: I0318 09:22:39.755084 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" event={"ID":"4a490b75-6853-41f7-b5b3-46243c4c2166","Type":"ContainerStarted","Data":"5ee3b13ece4d9acc78176c498a90576572d6699436ce526238a6b4027ba90016"} Mar 18 09:22:39 crc kubenswrapper[4778]: I0318 09:22:39.764573 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:22:40 crc kubenswrapper[4778]: I0318 09:22:40.081472 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 09:22:40 crc kubenswrapper[4778]: I0318 09:22:40.765748 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" event={"ID":"4a490b75-6853-41f7-b5b3-46243c4c2166","Type":"ContainerStarted","Data":"28ca12d3a58a03f2fd89d9677d86730c400a702b9f62ca98b238e2b7a92046fd"} Mar 18 09:22:40 crc kubenswrapper[4778]: I0318 09:22:40.765979 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:40 crc kubenswrapper[4778]: I0318 09:22:40.785321 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" podStartSLOduration=2.785299626 podStartE2EDuration="2.785299626s" podCreationTimestamp="2026-03-18 09:22:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:40.783997041 +0000 UTC m=+1227.358741911" watchObservedRunningTime="2026-03-18 09:22:40.785299626 +0000 UTC m=+1227.360044476" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.716699 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-sz5dt"] Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.717959 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.729254 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sz5dt"] Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.823227 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b89b-account-create-update-ff8z8"] Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.824572 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.827694 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.844101 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b89b-account-create-update-ff8z8"] Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.864081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nlgg\" (UniqueName: \"kubernetes.io/projected/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-kube-api-access-7nlgg\") pod \"cinder-db-create-sz5dt\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.864271 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-operator-scripts\") pod \"cinder-db-create-sz5dt\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.965941 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nlgg\" (UniqueName: \"kubernetes.io/projected/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-kube-api-access-7nlgg\") pod \"cinder-db-create-sz5dt\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.966046 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320c5adc-a7d8-47a3-893b-7614c755446d-operator-scripts\") pod \"cinder-b89b-account-create-update-ff8z8\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " 
pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.966084 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-operator-scripts\") pod \"cinder-db-create-sz5dt\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.966239 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjmzb\" (UniqueName: \"kubernetes.io/projected/320c5adc-a7d8-47a3-893b-7614c755446d-kube-api-access-hjmzb\") pod \"cinder-b89b-account-create-update-ff8z8\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.966797 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-operator-scripts\") pod \"cinder-db-create-sz5dt\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.006091 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nlgg\" (UniqueName: \"kubernetes.io/projected/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-kube-api-access-7nlgg\") pod \"cinder-db-create-sz5dt\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.011929 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-2cxtn"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.012842 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.022509 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6980-account-create-update-8lctt"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.024696 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.031950 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2cxtn"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.032156 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.043213 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6980-account-create-update-8lctt"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.043713 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.069522 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjmzb\" (UniqueName: \"kubernetes.io/projected/320c5adc-a7d8-47a3-893b-7614c755446d-kube-api-access-hjmzb\") pod \"cinder-b89b-account-create-update-ff8z8\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.069625 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320c5adc-a7d8-47a3-893b-7614c755446d-operator-scripts\") pod \"cinder-b89b-account-create-update-ff8z8\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.071166 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320c5adc-a7d8-47a3-893b-7614c755446d-operator-scripts\") pod \"cinder-b89b-account-create-update-ff8z8\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.107940 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjmzb\" (UniqueName: \"kubernetes.io/projected/320c5adc-a7d8-47a3-893b-7614c755446d-kube-api-access-hjmzb\") pod \"cinder-b89b-account-create-update-ff8z8\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.145593 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.158817 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-q979b"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.160530 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.171008 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfjht\" (UniqueName: \"kubernetes.io/projected/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-kube-api-access-vfjht\") pod \"barbican-db-create-2cxtn\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.172892 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-operator-scripts\") pod \"barbican-db-create-2cxtn\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.172985 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9719662a-4248-4c3c-860b-1a9e6547876b-operator-scripts\") pod \"barbican-6980-account-create-update-8lctt\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.173006 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-q979b"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.173163 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-r9kfz\" (UniqueName: \"kubernetes.io/projected/9719662a-4248-4c3c-860b-1a9e6547876b-kube-api-access-r9kfz\") pod \"barbican-6980-account-create-update-8lctt\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.274895 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9kfz\" (UniqueName: \"kubernetes.io/projected/9719662a-4248-4c3c-860b-1a9e6547876b-kube-api-access-r9kfz\") pod \"barbican-6980-account-create-update-8lctt\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.275306 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4e8f-account-create-update-ztvnt"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.277218 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.280189 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfjht\" (UniqueName: \"kubernetes.io/projected/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-kube-api-access-vfjht\") pod \"barbican-db-create-2cxtn\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.280281 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-operator-scripts\") pod \"barbican-db-create-2cxtn\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.280344 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88j2m\" (UniqueName: \"kubernetes.io/projected/dca6e4b2-4722-4a45-b577-33f3c5090fc3-kube-api-access-88j2m\") pod \"neutron-db-create-q979b\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.280553 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.281817 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-operator-scripts\") pod \"barbican-db-create-2cxtn\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.292726 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4e8f-account-create-update-ztvnt"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.301687 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9719662a-4248-4c3c-860b-1a9e6547876b-operator-scripts\") pod \"barbican-6980-account-create-update-8lctt\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.301747 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca6e4b2-4722-4a45-b577-33f3c5090fc3-operator-scripts\") pod \"neutron-db-create-q979b\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.302555 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9719662a-4248-4c3c-860b-1a9e6547876b-operator-scripts\") pod \"barbican-6980-account-create-update-8lctt\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.304902 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfjht\" (UniqueName: \"kubernetes.io/projected/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-kube-api-access-vfjht\") pod \"barbican-db-create-2cxtn\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.310775 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9kfz\" (UniqueName: \"kubernetes.io/projected/9719662a-4248-4c3c-860b-1a9e6547876b-kube-api-access-r9kfz\") pod \"barbican-6980-account-create-update-8lctt\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.316010 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-29tr5"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.317096 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.319991 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.320286 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.320503 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vcgh4" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.320708 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.326814 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-29tr5"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.339098 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.352822 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.403152 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cf66d17-48b6-4629-ae0c-e270afa0c88a-operator-scripts\") pod \"neutron-4e8f-account-create-update-ztvnt\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.403255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88j2m\" (UniqueName: \"kubernetes.io/projected/dca6e4b2-4722-4a45-b577-33f3c5090fc3-kube-api-access-88j2m\") pod \"neutron-db-create-q979b\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.403280 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcbxm\" (UniqueName: \"kubernetes.io/projected/7cf66d17-48b6-4629-ae0c-e270afa0c88a-kube-api-access-vcbxm\") pod \"neutron-4e8f-account-create-update-ztvnt\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.403313 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca6e4b2-4722-4a45-b577-33f3c5090fc3-operator-scripts\") pod \"neutron-db-create-q979b\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.404100 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca6e4b2-4722-4a45-b577-33f3c5090fc3-operator-scripts\") pod 
\"neutron-db-create-q979b\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.433265 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88j2m\" (UniqueName: \"kubernetes.io/projected/dca6e4b2-4722-4a45-b577-33f3c5090fc3-kube-api-access-88j2m\") pod \"neutron-db-create-q979b\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.505299 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-combined-ca-bundle\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.505388 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cf66d17-48b6-4629-ae0c-e270afa0c88a-operator-scripts\") pod \"neutron-4e8f-account-create-update-ztvnt\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.505448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcbxm\" (UniqueName: \"kubernetes.io/projected/7cf66d17-48b6-4629-ae0c-e270afa0c88a-kube-api-access-vcbxm\") pod \"neutron-4e8f-account-create-update-ztvnt\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.505531 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-config-data\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.505562 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwgrf\" (UniqueName: \"kubernetes.io/projected/bb6efa68-d15c-4d69-bd52-853a7cef8299-kube-api-access-nwgrf\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.506409 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cf66d17-48b6-4629-ae0c-e270afa0c88a-operator-scripts\") pod \"neutron-4e8f-account-create-update-ztvnt\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.523338 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcbxm\" (UniqueName: \"kubernetes.io/projected/7cf66d17-48b6-4629-ae0c-e270afa0c88a-kube-api-access-vcbxm\") pod \"neutron-4e8f-account-create-update-ztvnt\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.543782 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.607711 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-config-data\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.607750 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwgrf\" (UniqueName: \"kubernetes.io/projected/bb6efa68-d15c-4d69-bd52-853a7cef8299-kube-api-access-nwgrf\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.607808 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-combined-ca-bundle\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.611640 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-combined-ca-bundle\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.614765 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-config-data\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 
09:22:42.622691 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.630734 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwgrf\" (UniqueName: \"kubernetes.io/projected/bb6efa68-d15c-4d69-bd52-853a7cef8299-kube-api-access-nwgrf\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.633723 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.669664 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sz5dt"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.817559 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sz5dt" event={"ID":"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e","Type":"ContainerStarted","Data":"0165c76373e8bba304aac3a9dbc098778cf0c36e469c92578c0e2cf22008740f"} Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.823480 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2cxtn"] Mar 18 09:22:42 crc kubenswrapper[4778]: W0318 09:22:42.851435 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60fea5d6_a85d_40e3_81ef_1d499ba2ebf7.slice/crio-2e7478ba77bcc10d706da8ee103c2baadaecc953ed1cb3b02addae36d8d8e954 WatchSource:0}: Error finding container 2e7478ba77bcc10d706da8ee103c2baadaecc953ed1cb3b02addae36d8d8e954: Status 404 returned error can't find the container with id 2e7478ba77bcc10d706da8ee103c2baadaecc953ed1cb3b02addae36d8d8e954 Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.885948 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-b89b-account-create-update-ff8z8"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.921951 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6980-account-create-update-8lctt"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.955793 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-q979b"] Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.227169 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4e8f-account-create-update-ztvnt"] Mar 18 09:22:43 crc kubenswrapper[4778]: W0318 09:22:43.242223 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cf66d17_48b6_4629_ae0c_e270afa0c88a.slice/crio-bb44bfd1417a265b81c690612f8175c586724077fee8efd10a4a946e20794520 WatchSource:0}: Error finding container bb44bfd1417a265b81c690612f8175c586724077fee8efd10a4a946e20794520: Status 404 returned error can't find the container with id bb44bfd1417a265b81c690612f8175c586724077fee8efd10a4a946e20794520 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.319059 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-29tr5"] Mar 18 09:22:43 crc kubenswrapper[4778]: W0318 09:22:43.328620 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb6efa68_d15c_4d69_bd52_853a7cef8299.slice/crio-2a27443ef48ba4cd692dcd39b8ef54b2fc528f5d9adc50b0359b82445083a3ba WatchSource:0}: Error finding container 2a27443ef48ba4cd692dcd39b8ef54b2fc528f5d9adc50b0359b82445083a3ba: Status 404 returned error can't find the container with id 2a27443ef48ba4cd692dcd39b8ef54b2fc528f5d9adc50b0359b82445083a3ba Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.827248 4778 generic.go:334] "Generic (PLEG): container finished" podID="320c5adc-a7d8-47a3-893b-7614c755446d" 
containerID="61d521ae036849914a4701bb867f10a55ba71a80cea0eab40620c4a6aa10638d" exitCode=0 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.827557 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b89b-account-create-update-ff8z8" event={"ID":"320c5adc-a7d8-47a3-893b-7614c755446d","Type":"ContainerDied","Data":"61d521ae036849914a4701bb867f10a55ba71a80cea0eab40620c4a6aa10638d"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.827628 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b89b-account-create-update-ff8z8" event={"ID":"320c5adc-a7d8-47a3-893b-7614c755446d","Type":"ContainerStarted","Data":"d5bd6041f3d6a690cef4f200e37ab1d7e48da6ea531c761369a864292600b47c"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.829721 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-29tr5" event={"ID":"bb6efa68-d15c-4d69-bd52-853a7cef8299","Type":"ContainerStarted","Data":"2a27443ef48ba4cd692dcd39b8ef54b2fc528f5d9adc50b0359b82445083a3ba"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.847553 4778 generic.go:334] "Generic (PLEG): container finished" podID="35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" containerID="70d53574867291895895df87d2a68bed084a005ff2e35622dba06f6dac00a1ee" exitCode=0 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.847918 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sz5dt" event={"ID":"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e","Type":"ContainerDied","Data":"70d53574867291895895df87d2a68bed084a005ff2e35622dba06f6dac00a1ee"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.850268 4778 generic.go:334] "Generic (PLEG): container finished" podID="9719662a-4248-4c3c-860b-1a9e6547876b" containerID="76d9700b7eab0fc318bf79deafd901e277918a900858f790ff6f7d08ee5e2133" exitCode=0 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.850421 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-6980-account-create-update-8lctt" event={"ID":"9719662a-4248-4c3c-860b-1a9e6547876b","Type":"ContainerDied","Data":"76d9700b7eab0fc318bf79deafd901e277918a900858f790ff6f7d08ee5e2133"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.850455 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6980-account-create-update-8lctt" event={"ID":"9719662a-4248-4c3c-860b-1a9e6547876b","Type":"ContainerStarted","Data":"6fb547a0f07403671e731bc4adef6aaddb845cc7a67233c9c5013a336629fe47"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.852678 4778 generic.go:334] "Generic (PLEG): container finished" podID="60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" containerID="aa018cf8a109c6b5750c2118b0eb74eb108759f278376c856e84638cf2d31164" exitCode=0 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.852758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2cxtn" event={"ID":"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7","Type":"ContainerDied","Data":"aa018cf8a109c6b5750c2118b0eb74eb108759f278376c856e84638cf2d31164"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.852807 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2cxtn" event={"ID":"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7","Type":"ContainerStarted","Data":"2e7478ba77bcc10d706da8ee103c2baadaecc953ed1cb3b02addae36d8d8e954"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.855509 4778 generic.go:334] "Generic (PLEG): container finished" podID="dca6e4b2-4722-4a45-b577-33f3c5090fc3" containerID="c1c9a9ac26d842e9f380c2bc10d90467713b713ce0c3ac97f08ea834682773ee" exitCode=0 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.855588 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q979b" event={"ID":"dca6e4b2-4722-4a45-b577-33f3c5090fc3","Type":"ContainerDied","Data":"c1c9a9ac26d842e9f380c2bc10d90467713b713ce0c3ac97f08ea834682773ee"} Mar 18 09:22:43 crc 
kubenswrapper[4778]: I0318 09:22:43.855610 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q979b" event={"ID":"dca6e4b2-4722-4a45-b577-33f3c5090fc3","Type":"ContainerStarted","Data":"90b9fbf4c7f0c1bd9271bb45aa6b12e259b1bdfb733dd018092b94b0a0987c32"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.861419 4778 generic.go:334] "Generic (PLEG): container finished" podID="7cf66d17-48b6-4629-ae0c-e270afa0c88a" containerID="1b70618e3e5fc20f170a550bf06d195893c5a61a58727ad242d6881bbcef4e7a" exitCode=0 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.861463 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e8f-account-create-update-ztvnt" event={"ID":"7cf66d17-48b6-4629-ae0c-e270afa0c88a","Type":"ContainerDied","Data":"1b70618e3e5fc20f170a550bf06d195893c5a61a58727ad242d6881bbcef4e7a"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.861489 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e8f-account-create-update-ztvnt" event={"ID":"7cf66d17-48b6-4629-ae0c-e270afa0c88a","Type":"ContainerStarted","Data":"bb44bfd1417a265b81c690612f8175c586724077fee8efd10a4a946e20794520"} Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.787041 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.800960 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.842824 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.851083 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-q979b" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.868739 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.905451 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.907382 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcbxm\" (UniqueName: \"kubernetes.io/projected/7cf66d17-48b6-4629-ae0c-e270afa0c88a-kube-api-access-vcbxm\") pod \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.907449 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca6e4b2-4722-4a45-b577-33f3c5090fc3-operator-scripts\") pod \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.907503 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nlgg\" (UniqueName: \"kubernetes.io/projected/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-kube-api-access-7nlgg\") pod \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.908752 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dca6e4b2-4722-4a45-b577-33f3c5090fc3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dca6e4b2-4722-4a45-b577-33f3c5090fc3" (UID: "dca6e4b2-4722-4a45-b577-33f3c5090fc3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.910297 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-operator-scripts\") pod \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.910362 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9kfz\" (UniqueName: \"kubernetes.io/projected/9719662a-4248-4c3c-860b-1a9e6547876b-kube-api-access-r9kfz\") pod \"9719662a-4248-4c3c-860b-1a9e6547876b\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.910413 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9719662a-4248-4c3c-860b-1a9e6547876b-operator-scripts\") pod \"9719662a-4248-4c3c-860b-1a9e6547876b\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.910483 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88j2m\" (UniqueName: \"kubernetes.io/projected/dca6e4b2-4722-4a45-b577-33f3c5090fc3-kube-api-access-88j2m\") pod \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.910542 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cf66d17-48b6-4629-ae0c-e270afa0c88a-operator-scripts\") pod \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.911915 4778 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca6e4b2-4722-4a45-b577-33f3c5090fc3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.914975 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.915147 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" (UID: "35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.915796 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2cxtn" event={"ID":"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7","Type":"ContainerDied","Data":"2e7478ba77bcc10d706da8ee103c2baadaecc953ed1cb3b02addae36d8d8e954"} Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.915912 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e7478ba77bcc10d706da8ee103c2baadaecc953ed1cb3b02addae36d8d8e954" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.916114 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9719662a-4248-4c3c-860b-1a9e6547876b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9719662a-4248-4c3c-860b-1a9e6547876b" (UID: "9719662a-4248-4c3c-860b-1a9e6547876b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.916283 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cf66d17-48b6-4629-ae0c-e270afa0c88a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7cf66d17-48b6-4629-ae0c-e270afa0c88a" (UID: "7cf66d17-48b6-4629-ae0c-e270afa0c88a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.919708 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-q979b" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.920469 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q979b" event={"ID":"dca6e4b2-4722-4a45-b577-33f3c5090fc3","Type":"ContainerDied","Data":"90b9fbf4c7f0c1bd9271bb45aa6b12e259b1bdfb733dd018092b94b0a0987c32"} Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.920517 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90b9fbf4c7f0c1bd9271bb45aa6b12e259b1bdfb733dd018092b94b0a0987c32" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.921515 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca6e4b2-4722-4a45-b577-33f3c5090fc3-kube-api-access-88j2m" (OuterVolumeSpecName: "kube-api-access-88j2m") pod "dca6e4b2-4722-4a45-b577-33f3c5090fc3" (UID: "dca6e4b2-4722-4a45-b577-33f3c5090fc3"). InnerVolumeSpecName "kube-api-access-88j2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.924247 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.924330 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e8f-account-create-update-ztvnt" event={"ID":"7cf66d17-48b6-4629-ae0c-e270afa0c88a","Type":"ContainerDied","Data":"bb44bfd1417a265b81c690612f8175c586724077fee8efd10a4a946e20794520"} Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.924352 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb44bfd1417a265b81c690612f8175c586724077fee8efd10a4a946e20794520" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.924672 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf66d17-48b6-4629-ae0c-e270afa0c88a-kube-api-access-vcbxm" (OuterVolumeSpecName: "kube-api-access-vcbxm") pod "7cf66d17-48b6-4629-ae0c-e270afa0c88a" (UID: "7cf66d17-48b6-4629-ae0c-e270afa0c88a"). InnerVolumeSpecName "kube-api-access-vcbxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.925439 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-kube-api-access-7nlgg" (OuterVolumeSpecName: "kube-api-access-7nlgg") pod "35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" (UID: "35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e"). InnerVolumeSpecName "kube-api-access-7nlgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.927304 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9719662a-4248-4c3c-860b-1a9e6547876b-kube-api-access-r9kfz" (OuterVolumeSpecName: "kube-api-access-r9kfz") pod "9719662a-4248-4c3c-860b-1a9e6547876b" (UID: "9719662a-4248-4c3c-860b-1a9e6547876b"). InnerVolumeSpecName "kube-api-access-r9kfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.940955 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b89b-account-create-update-ff8z8" event={"ID":"320c5adc-a7d8-47a3-893b-7614c755446d","Type":"ContainerDied","Data":"d5bd6041f3d6a690cef4f200e37ab1d7e48da6ea531c761369a864292600b47c"} Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.941003 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.941015 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5bd6041f3d6a690cef4f200e37ab1d7e48da6ea531c761369a864292600b47c" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.943854 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.947336 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sz5dt" event={"ID":"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e","Type":"ContainerDied","Data":"0165c76373e8bba304aac3a9dbc098778cf0c36e469c92578c0e2cf22008740f"} Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.947689 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0165c76373e8bba304aac3a9dbc098778cf0c36e469c92578c0e2cf22008740f" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.958402 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.958731 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6980-account-create-update-8lctt" event={"ID":"9719662a-4248-4c3c-860b-1a9e6547876b","Type":"ContainerDied","Data":"6fb547a0f07403671e731bc4adef6aaddb845cc7a67233c9c5013a336629fe47"} Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.958803 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fb547a0f07403671e731bc4adef6aaddb845cc7a67233c9c5013a336629fe47" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.012529 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfjht\" (UniqueName: \"kubernetes.io/projected/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-kube-api-access-vfjht\") pod \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.012722 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjmzb\" (UniqueName: \"kubernetes.io/projected/320c5adc-a7d8-47a3-893b-7614c755446d-kube-api-access-hjmzb\") pod \"320c5adc-a7d8-47a3-893b-7614c755446d\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.012813 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-operator-scripts\") pod \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.012836 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320c5adc-a7d8-47a3-893b-7614c755446d-operator-scripts\") pod 
\"320c5adc-a7d8-47a3-893b-7614c755446d\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013156 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013175 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9kfz\" (UniqueName: \"kubernetes.io/projected/9719662a-4248-4c3c-860b-1a9e6547876b-kube-api-access-r9kfz\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013188 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9719662a-4248-4c3c-860b-1a9e6547876b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013216 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88j2m\" (UniqueName: \"kubernetes.io/projected/dca6e4b2-4722-4a45-b577-33f3c5090fc3-kube-api-access-88j2m\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013229 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cf66d17-48b6-4629-ae0c-e270afa0c88a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013243 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcbxm\" (UniqueName: \"kubernetes.io/projected/7cf66d17-48b6-4629-ae0c-e270afa0c88a-kube-api-access-vcbxm\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013257 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nlgg\" (UniqueName: 
\"kubernetes.io/projected/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-kube-api-access-7nlgg\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013373 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" (UID: "60fea5d6-a85d-40e3-81ef-1d499ba2ebf7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.014272 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/320c5adc-a7d8-47a3-893b-7614c755446d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "320c5adc-a7d8-47a3-893b-7614c755446d" (UID: "320c5adc-a7d8-47a3-893b-7614c755446d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.016172 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-kube-api-access-vfjht" (OuterVolumeSpecName: "kube-api-access-vfjht") pod "60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" (UID: "60fea5d6-a85d-40e3-81ef-1d499ba2ebf7"). InnerVolumeSpecName "kube-api-access-vfjht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.017631 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320c5adc-a7d8-47a3-893b-7614c755446d-kube-api-access-hjmzb" (OuterVolumeSpecName: "kube-api-access-hjmzb") pod "320c5adc-a7d8-47a3-893b-7614c755446d" (UID: "320c5adc-a7d8-47a3-893b-7614c755446d"). InnerVolumeSpecName "kube-api-access-hjmzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.115579 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.116098 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320c5adc-a7d8-47a3-893b-7614c755446d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.116165 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfjht\" (UniqueName: \"kubernetes.io/projected/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-kube-api-access-vfjht\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.116274 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjmzb\" (UniqueName: \"kubernetes.io/projected/320c5adc-a7d8-47a3-893b-7614c755446d-kube-api-access-hjmzb\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.563937 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.643265 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jt8gb"] Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.643519 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-jt8gb" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerName="dnsmasq-dns" containerID="cri-o://0cb5262a9d3d057ca53ec602e21c523692c8679333829bacc24282be5307d258" gracePeriod=10 Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.979184 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerID="0cb5262a9d3d057ca53ec602e21c523692c8679333829bacc24282be5307d258" exitCode=0 Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.979641 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jt8gb" event={"ID":"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5","Type":"ContainerDied","Data":"0cb5262a9d3d057ca53ec602e21c523692c8679333829bacc24282be5307d258"} Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.981296 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-29tr5" event={"ID":"bb6efa68-d15c-4d69-bd52-853a7cef8299","Type":"ContainerStarted","Data":"8c18edac4f9dbd5b2d5ca7397e82de877b46a7b70e08421fff14ad85201dac7d"} Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.001316 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-29tr5" podStartSLOduration=2.7645939459999997 podStartE2EDuration="7.001293304s" podCreationTimestamp="2026-03-18 09:22:42 +0000 UTC" firstStartedPulling="2026-03-18 09:22:43.3333647 +0000 UTC m=+1229.908109540" lastFinishedPulling="2026-03-18 09:22:47.570064048 +0000 UTC m=+1234.144808898" observedRunningTime="2026-03-18 09:22:49.000539093 +0000 UTC m=+1235.575283953" watchObservedRunningTime="2026-03-18 09:22:49.001293304 +0000 UTC m=+1235.576038144" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.181777 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.245326 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-dns-svc\") pod \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.245383 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-config\") pod \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.245595 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-nb\") pod \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.245624 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-sb\") pod \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.245725 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqgnd\" (UniqueName: \"kubernetes.io/projected/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-kube-api-access-qqgnd\") pod \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.260774 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-kube-api-access-qqgnd" (OuterVolumeSpecName: "kube-api-access-qqgnd") pod "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" (UID: "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5"). InnerVolumeSpecName "kube-api-access-qqgnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.316164 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" (UID: "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.345661 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-config" (OuterVolumeSpecName: "config") pod "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" (UID: "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.347997 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqgnd\" (UniqueName: \"kubernetes.io/projected/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-kube-api-access-qqgnd\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.348036 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.348049 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.348695 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" (UID: "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.361263 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" (UID: "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.450312 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.450698 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.995499 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jt8gb" event={"ID":"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5","Type":"ContainerDied","Data":"95b556045a6555de0e6526548932d8f77419a56ad86f2e3afbe7000b4b9694c6"} Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.995633 4778 scope.go:117] "RemoveContainer" containerID="0cb5262a9d3d057ca53ec602e21c523692c8679333829bacc24282be5307d258" Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.995528 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:22:50 crc kubenswrapper[4778]: I0318 09:22:50.028620 4778 scope.go:117] "RemoveContainer" containerID="67e25fe46d44f377f33f04f4df4c1bf2f96243fad2d8a6135f426851e5ac8c58" Mar 18 09:22:50 crc kubenswrapper[4778]: I0318 09:22:50.044786 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jt8gb"] Mar 18 09:22:50 crc kubenswrapper[4778]: I0318 09:22:50.068558 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jt8gb"] Mar 18 09:22:50 crc kubenswrapper[4778]: I0318 09:22:50.196519 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" path="/var/lib/kubelet/pods/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5/volumes" Mar 18 09:22:51 crc kubenswrapper[4778]: E0318 09:22:51.006658 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb6efa68_d15c_4d69_bd52_853a7cef8299.slice/crio-8c18edac4f9dbd5b2d5ca7397e82de877b46a7b70e08421fff14ad85201dac7d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb6efa68_d15c_4d69_bd52_853a7cef8299.slice/crio-conmon-8c18edac4f9dbd5b2d5ca7397e82de877b46a7b70e08421fff14ad85201dac7d.scope\": RecentStats: unable to find data in memory cache]" Mar 18 09:22:51 crc kubenswrapper[4778]: I0318 09:22:51.010110 4778 generic.go:334] "Generic (PLEG): container finished" podID="bb6efa68-d15c-4d69-bd52-853a7cef8299" containerID="8c18edac4f9dbd5b2d5ca7397e82de877b46a7b70e08421fff14ad85201dac7d" exitCode=0 Mar 18 09:22:51 crc kubenswrapper[4778]: I0318 09:22:51.010176 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-29tr5" 
event={"ID":"bb6efa68-d15c-4d69-bd52-853a7cef8299","Type":"ContainerDied","Data":"8c18edac4f9dbd5b2d5ca7397e82de877b46a7b70e08421fff14ad85201dac7d"} Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.366495 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.504298 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwgrf\" (UniqueName: \"kubernetes.io/projected/bb6efa68-d15c-4d69-bd52-853a7cef8299-kube-api-access-nwgrf\") pod \"bb6efa68-d15c-4d69-bd52-853a7cef8299\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.504436 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-combined-ca-bundle\") pod \"bb6efa68-d15c-4d69-bd52-853a7cef8299\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.504515 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-config-data\") pod \"bb6efa68-d15c-4d69-bd52-853a7cef8299\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.511029 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6efa68-d15c-4d69-bd52-853a7cef8299-kube-api-access-nwgrf" (OuterVolumeSpecName: "kube-api-access-nwgrf") pod "bb6efa68-d15c-4d69-bd52-853a7cef8299" (UID: "bb6efa68-d15c-4d69-bd52-853a7cef8299"). InnerVolumeSpecName "kube-api-access-nwgrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.534718 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb6efa68-d15c-4d69-bd52-853a7cef8299" (UID: "bb6efa68-d15c-4d69-bd52-853a7cef8299"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.564459 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-config-data" (OuterVolumeSpecName: "config-data") pod "bb6efa68-d15c-4d69-bd52-853a7cef8299" (UID: "bb6efa68-d15c-4d69-bd52-853a7cef8299"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.607132 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.607174 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwgrf\" (UniqueName: \"kubernetes.io/projected/bb6efa68-d15c-4d69-bd52-853a7cef8299-kube-api-access-nwgrf\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.607187 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.032500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-29tr5" 
event={"ID":"bb6efa68-d15c-4d69-bd52-853a7cef8299","Type":"ContainerDied","Data":"2a27443ef48ba4cd692dcd39b8ef54b2fc528f5d9adc50b0359b82445083a3ba"} Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.032568 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a27443ef48ba4cd692dcd39b8ef54b2fc528f5d9adc50b0359b82445083a3ba" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.032573 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219374 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67795cd9-m2fhf"] Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219787 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" containerName="mariadb-database-create" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219810 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" containerName="mariadb-database-create" Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219827 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320c5adc-a7d8-47a3-893b-7614c755446d" containerName="mariadb-account-create-update" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219839 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="320c5adc-a7d8-47a3-893b-7614c755446d" containerName="mariadb-account-create-update" Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219853 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" containerName="mariadb-database-create" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219861 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" containerName="mariadb-database-create" Mar 18 09:22:53 crc 
kubenswrapper[4778]: E0318 09:22:53.219876 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerName="init" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219883 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerName="init" Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219902 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6efa68-d15c-4d69-bd52-853a7cef8299" containerName="keystone-db-sync" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219910 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6efa68-d15c-4d69-bd52-853a7cef8299" containerName="keystone-db-sync" Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219921 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9719662a-4248-4c3c-860b-1a9e6547876b" containerName="mariadb-account-create-update" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219929 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9719662a-4248-4c3c-860b-1a9e6547876b" containerName="mariadb-account-create-update" Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219944 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerName="dnsmasq-dns" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219951 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerName="dnsmasq-dns" Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219965 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf66d17-48b6-4629-ae0c-e270afa0c88a" containerName="mariadb-account-create-update" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219973 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf66d17-48b6-4629-ae0c-e270afa0c88a" containerName="mariadb-account-create-update" Mar 18 
09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219987 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca6e4b2-4722-4a45-b577-33f3c5090fc3" containerName="mariadb-database-create" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219995 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca6e4b2-4722-4a45-b577-33f3c5090fc3" containerName="mariadb-database-create" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220198 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6efa68-d15c-4d69-bd52-853a7cef8299" containerName="keystone-db-sync" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220310 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9719662a-4248-4c3c-860b-1a9e6547876b" containerName="mariadb-account-create-update" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220324 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="320c5adc-a7d8-47a3-893b-7614c755446d" containerName="mariadb-account-create-update" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220332 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca6e4b2-4722-4a45-b577-33f3c5090fc3" containerName="mariadb-database-create" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220341 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerName="dnsmasq-dns" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220356 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" containerName="mariadb-database-create" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220373 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf66d17-48b6-4629-ae0c-e270afa0c88a" containerName="mariadb-account-create-update" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220385 4778 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" containerName="mariadb-database-create" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.228537 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.239308 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pttzb"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.240552 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.243976 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vcgh4" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.244241 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.244395 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.245145 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.246121 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.258691 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pttzb"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.312321 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-m2fhf"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.325890 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.325962 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-config-data\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326010 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npsqv\" (UniqueName: \"kubernetes.io/projected/45f70c00-938c-4f67-9c5b-4a88c90b62ae-kube-api-access-npsqv\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326042 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326088 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-dns-svc\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326120 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-fernet-keys\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326163 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9lph\" (UniqueName: \"kubernetes.io/projected/835f3aad-57da-48e7-ac5a-f0635ee9bc98-kube-api-access-x9lph\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326246 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-credential-keys\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326283 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-scripts\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326677 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-config\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326723 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-combined-ca-bundle\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445072 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-fernet-keys\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445137 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9lph\" (UniqueName: \"kubernetes.io/projected/835f3aad-57da-48e7-ac5a-f0635ee9bc98-kube-api-access-x9lph\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445182 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-credential-keys\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445612 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-scripts\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445665 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-config\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: 
\"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445694 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-combined-ca-bundle\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445740 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445772 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-config-data\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445823 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npsqv\" (UniqueName: \"kubernetes.io/projected/45f70c00-938c-4f67-9c5b-4a88c90b62ae-kube-api-access-npsqv\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445857 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " 
pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445891 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-dns-svc\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.446955 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-dns-svc\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.449386 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-zbghp"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.451326 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-fernet-keys\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.459874 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.460474 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: 
\"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.461402 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-config\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.461722 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-config-data\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.463699 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.463694 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-credential-keys\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.466681 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-combined-ca-bundle\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.471266 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67f887b5-9vsz7"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.472701 4778 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.475288 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-scripts\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.475545 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tvhb7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.476767 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.490498 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.499508 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.499705 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.499819 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.499986 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-f4kcp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.500426 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zbghp"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.520262 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67f887b5-9vsz7"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.523051 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-npsqv\" (UniqueName: \"kubernetes.io/projected/45f70c00-938c-4f67-9c5b-4a88c90b62ae-kube-api-access-npsqv\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.523463 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9lph\" (UniqueName: \"kubernetes.io/projected/835f3aad-57da-48e7-ac5a-f0635ee9bc98-kube-api-access-x9lph\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.525898 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jb4ss"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.526905 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.534394 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.534593 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r86tv" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.534743 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.542422 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jb4ss"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550008 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-scripts\") pod \"horizon-67f887b5-9vsz7\" (UID: 
\"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550118 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f507e196-94ca-4c4a-91f4-de3587084d30-logs\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550148 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f507e196-94ca-4c4a-91f4-de3587084d30-horizon-secret-key\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550167 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcdfb\" (UniqueName: \"kubernetes.io/projected/4d42905f-c189-4021-834d-f2a81dae5a4a-kube-api-access-lcdfb\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550192 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-config\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550226 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-combined-ca-bundle\") pod \"neutron-db-sync-zbghp\" (UID: 
\"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550252 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzl9p\" (UniqueName: \"kubernetes.io/projected/f507e196-94ca-4c4a-91f4-de3587084d30-kube-api-access-gzl9p\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550269 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-config-data\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.553955 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.575865 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.629783 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-m2fhf"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651248 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fba399d9-71ac-41c3-912f-32ccc7fc6190-etc-machine-id\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651296 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-combined-ca-bundle\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-config-data\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651469 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9z5l\" (UniqueName: \"kubernetes.io/projected/fba399d9-71ac-41c3-912f-32ccc7fc6190-kube-api-access-c9z5l\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651638 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-db-sync-config-data\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651670 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f507e196-94ca-4c4a-91f4-de3587084d30-logs\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651702 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-scripts\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651747 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f507e196-94ca-4c4a-91f4-de3587084d30-horizon-secret-key\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651772 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcdfb\" (UniqueName: \"kubernetes.io/projected/4d42905f-c189-4021-834d-f2a81dae5a4a-kube-api-access-lcdfb\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651825 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-config\") pod \"neutron-db-sync-zbghp\" (UID: 
\"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651842 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-combined-ca-bundle\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651890 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzl9p\" (UniqueName: \"kubernetes.io/projected/f507e196-94ca-4c4a-91f4-de3587084d30-kube-api-access-gzl9p\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651917 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-config-data\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651950 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-scripts\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.652813 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-scripts\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 
09:22:53.653054 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f507e196-94ca-4c4a-91f4-de3587084d30-logs\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.656507 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-config-data\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.660981 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f507e196-94ca-4c4a-91f4-de3587084d30-horizon-secret-key\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.667847 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-combined-ca-bundle\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.668493 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-config\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.688470 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-98prp"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.690548 4778 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.695261 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-66r47" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.695458 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.703871 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcdfb\" (UniqueName: \"kubernetes.io/projected/4d42905f-c189-4021-834d-f2a81dae5a4a-kube-api-access-lcdfb\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.721785 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-98prp"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.727637 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzl9p\" (UniqueName: \"kubernetes.io/projected/f507e196-94ca-4c4a-91f4-de3587084d30-kube-api-access-gzl9p\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.747277 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2p9jg"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.748289 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.762692 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4lsnj" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.762939 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763422 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763187 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-db-sync-config-data\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763565 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-scripts\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763588 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-combined-ca-bundle\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763768 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-db-sync-config-data\") pod 
\"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763853 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fba399d9-71ac-41c3-912f-32ccc7fc6190-etc-machine-id\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763891 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-combined-ca-bundle\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763908 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gnxr\" (UniqueName: \"kubernetes.io/projected/4135fc20-df28-4f8d-b244-aedd5ed57cc2-kube-api-access-7gnxr\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-config-data\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763976 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9z5l\" (UniqueName: \"kubernetes.io/projected/fba399d9-71ac-41c3-912f-32ccc7fc6190-kube-api-access-c9z5l\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " 
pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.765064 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fba399d9-71ac-41c3-912f-32ccc7fc6190-etc-machine-id\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.772839 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-db-sync-config-data\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.780511 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-scripts\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.780974 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-nz24n"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.782570 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.781028 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-combined-ca-bundle\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.800022 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9z5l\" (UniqueName: \"kubernetes.io/projected/fba399d9-71ac-41c3-912f-32ccc7fc6190-kube-api-access-c9z5l\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.814087 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2p9jg"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.814217 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-config-data\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.849260 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-nz24n"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.865242 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-568467c8dc-v4vlb"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.866636 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.867992 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7f7g\" (UniqueName: \"kubernetes.io/projected/20eafe8e-c0b9-4463-bc12-8c0cd0359968-kube-api-access-s7f7g\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868014 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-combined-ca-bundle\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868059 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gnxr\" (UniqueName: \"kubernetes.io/projected/4135fc20-df28-4f8d-b244-aedd5ed57cc2-kube-api-access-7gnxr\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868088 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868104 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: 
\"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868153 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20eafe8e-c0b9-4463-bc12-8c0cd0359968-logs\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868193 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-combined-ca-bundle\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868226 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868249 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-config\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868285 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lwww\" (UniqueName: \"kubernetes.io/projected/0ccbef26-8b5f-4b83-885f-3c074207eb73-kube-api-access-4lwww\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: 
\"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-scripts\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868320 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-config-data\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868338 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-db-sync-config-data\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.884059 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-combined-ca-bundle\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.884440 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-db-sync-config-data\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 
09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.896712 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gnxr\" (UniqueName: \"kubernetes.io/projected/4135fc20-df28-4f8d-b244-aedd5ed57cc2-kube-api-access-7gnxr\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.896767 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-568467c8dc-v4vlb"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.905091 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.907737 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.909644 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.909850 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.922155 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969818 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-config\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969871 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lwww\" (UniqueName: \"kubernetes.io/projected/0ccbef26-8b5f-4b83-885f-3c074207eb73-kube-api-access-4lwww\") pod 
\"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969890 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-scripts\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969909 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-config-data\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969934 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-config-data\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969967 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7f7g\" (UniqueName: \"kubernetes.io/projected/20eafe8e-c0b9-4463-bc12-8c0cd0359968-kube-api-access-s7f7g\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969982 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-combined-ca-bundle\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " 
pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970003 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psmpb\" (UniqueName: \"kubernetes.io/projected/14a62749-8336-4894-a162-85350096aef4-kube-api-access-psmpb\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970035 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970052 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970067 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a62749-8336-4894-a162-85350096aef4-logs\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970091 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14a62749-8336-4894-a162-85350096aef4-horizon-secret-key\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " 
pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970130 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20eafe8e-c0b9-4463-bc12-8c0cd0359968-logs\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970149 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-scripts\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970166 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970918 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-config\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.971583 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.971628 4778 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.972539 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.972655 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.977498 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20eafe8e-c0b9-4463-bc12-8c0cd0359968-logs\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.978004 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-config-data\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.980826 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-scripts\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.989756 4778 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.989878 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-combined-ca-bundle\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.994591 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lwww\" (UniqueName: \"kubernetes.io/projected/0ccbef26-8b5f-4b83-885f-3c074207eb73-kube-api-access-4lwww\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.994846 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7f7g\" (UniqueName: \"kubernetes.io/projected/20eafe8e-c0b9-4463-bc12-8c0cd0359968-kube-api-access-s7f7g\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.008687 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.045863 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071310 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-log-httpd\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071374 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t77rr\" (UniqueName: \"kubernetes.io/projected/d609db91-e011-4ac4-91a6-a9ba51f3918e-kube-api-access-t77rr\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071400 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071431 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psmpb\" (UniqueName: \"kubernetes.io/projected/14a62749-8336-4894-a162-85350096aef4-kube-api-access-psmpb\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071468 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a62749-8336-4894-a162-85350096aef4-logs\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: 
I0318 09:22:54.071493 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14a62749-8336-4894-a162-85350096aef4-horizon-secret-key\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071510 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071554 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-run-httpd\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071571 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-scripts\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071600 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-scripts\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071627 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-config-data\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071655 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-config-data\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.072917 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-config-data\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.076875 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a62749-8336-4894-a162-85350096aef4-logs\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.078886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-scripts\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.083555 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14a62749-8336-4894-a162-85350096aef4-horizon-secret-key\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " 
pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.098734 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psmpb\" (UniqueName: \"kubernetes.io/projected/14a62749-8336-4894-a162-85350096aef4-kube-api-access-psmpb\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.101347 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.158345 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173004 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173081 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-run-httpd\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173164 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-scripts\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173207 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-config-data\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173247 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-log-httpd\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173266 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t77rr\" (UniqueName: \"kubernetes.io/projected/d609db91-e011-4ac4-91a6-a9ba51f3918e-kube-api-access-t77rr\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173296 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.180238 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.181904 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-scripts\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 
09:22:54.183328 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-config-data\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.183602 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-log-httpd\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.184406 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-run-httpd\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.190663 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.204400 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.208962 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t77rr\" (UniqueName: \"kubernetes.io/projected/d609db91-e011-4ac4-91a6-a9ba51f3918e-kube-api-access-t77rr\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.293789 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.422514 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-568467c8dc-v4vlb"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.457767 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7967bcbb45-bl6c8"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.460726 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.495050 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7967bcbb45-bl6c8"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.598989 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-scripts\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.599084 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.599115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-horizon-secret-key\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.599145 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l9tz\" (UniqueName: \"kubernetes.io/projected/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-kube-api-access-8l9tz\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.599172 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-logs\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.700547 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-scripts\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.700643 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.700689 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-horizon-secret-key\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.700729 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l9tz\" 
(UniqueName: \"kubernetes.io/projected/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-kube-api-access-8l9tz\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.700768 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-logs\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.701430 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-logs\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.702170 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-scripts\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.703458 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.709075 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-horizon-secret-key\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") 
" pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.727889 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l9tz\" (UniqueName: \"kubernetes.io/projected/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-kube-api-access-8l9tz\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.783484 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.833332 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jb4ss"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.852453 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:22:55 crc kubenswrapper[4778]: W0318 09:22:55.878646 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfba399d9_71ac_41c3_912f_32ccc7fc6190.slice/crio-fef33d3c7f1855b88b205e58a953db1eba3e6979fcc1430ebec66414bb961b50 WatchSource:0}: Error finding container fef33d3c7f1855b88b205e58a953db1eba3e6979fcc1430ebec66414bb961b50: Status 404 returned error can't find the container with id fef33d3c7f1855b88b205e58a953db1eba3e6979fcc1430ebec66414bb961b50 Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.892455 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-m2fhf"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.907027 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-568467c8dc-v4vlb"] Mar 18 09:22:55 crc kubenswrapper[4778]: W0318 09:22:55.910515 4778 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20eafe8e_c0b9_4463_bc12_8c0cd0359968.slice/crio-f3040164c08e605fe47fcb403b7a54ffe6495d3760996218e88d3582673d242b WatchSource:0}: Error finding container f3040164c08e605fe47fcb403b7a54ffe6495d3760996218e88d3582673d242b: Status 404 returned error can't find the container with id f3040164c08e605fe47fcb403b7a54ffe6495d3760996218e88d3582673d242b Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.919286 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zbghp"] Mar 18 09:22:55 crc kubenswrapper[4778]: W0318 09:22:55.925896 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d42905f_c189_4021_834d_f2a81dae5a4a.slice/crio-0ae8af7be7deced401f81be079f01e4b10cb21029c529a4299ba7b28d3003f40 WatchSource:0}: Error finding container 0ae8af7be7deced401f81be079f01e4b10cb21029c529a4299ba7b28d3003f40: Status 404 returned error can't find the container with id 0ae8af7be7deced401f81be079f01e4b10cb21029c529a4299ba7b28d3003f40 Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.928073 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-nz24n"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.935604 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2p9jg"] Mar 18 09:22:55 crc kubenswrapper[4778]: W0318 09:22:55.940292 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf507e196_94ca_4c4a_91f4_de3587084d30.slice/crio-47668775c33692efd81b7bc7a0bfacb000692e6da118d403a58db7fb6562c979 WatchSource:0}: Error finding container 47668775c33692efd81b7bc7a0bfacb000692e6da118d403a58db7fb6562c979: Status 404 returned error can't find the container with id 47668775c33692efd81b7bc7a0bfacb000692e6da118d403a58db7fb6562c979 Mar 18 09:22:55 crc 
kubenswrapper[4778]: I0318 09:22:55.943270 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67f887b5-9vsz7"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.950771 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-98prp"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.969052 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pttzb"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.983180 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.162507 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zbghp" event={"ID":"4d42905f-c189-4021-834d-f2a81dae5a4a","Type":"ContainerStarted","Data":"0ae8af7be7deced401f81be079f01e4b10cb21029c529a4299ba7b28d3003f40"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.185100 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" event={"ID":"45f70c00-938c-4f67-9c5b-4a88c90b62ae","Type":"ContainerStarted","Data":"2ad9afed41c247bfddfd9b123a9c47b45acee2887be9ed80d8c3249a1334c51d"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.206547 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jb4ss" event={"ID":"fba399d9-71ac-41c3-912f-32ccc7fc6190","Type":"ContainerStarted","Data":"fef33d3c7f1855b88b205e58a953db1eba3e6979fcc1430ebec66414bb961b50"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.207375 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-568467c8dc-v4vlb" event={"ID":"14a62749-8336-4894-a162-85350096aef4","Type":"ContainerStarted","Data":"72adaf6b6c1593294d8954a516b806ad25d5b5421c025d726c18c87806d6bbc1"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.207528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-2p9jg" event={"ID":"20eafe8e-c0b9-4463-bc12-8c0cd0359968","Type":"ContainerStarted","Data":"f3040164c08e605fe47fcb403b7a54ffe6495d3760996218e88d3582673d242b"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.207641 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-98prp" event={"ID":"4135fc20-df28-4f8d-b244-aedd5ed57cc2","Type":"ContainerStarted","Data":"49a7b38747f8562cd7a4e6e050cec3d81202329adfd053459a59103fb40b4766"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.207746 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" event={"ID":"0ccbef26-8b5f-4b83-885f-3c074207eb73","Type":"ContainerStarted","Data":"198149700e09863ba5824c3da59cdfa277dbef6c16734a23ecc5c937b970d9dc"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.207866 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerStarted","Data":"08fbe88aeb204ddd782e3073f280061d837a707d2c10f9b95b4eb6828823ed41"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.207974 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pttzb" event={"ID":"835f3aad-57da-48e7-ac5a-f0635ee9bc98","Type":"ContainerStarted","Data":"5f8d8cdbe6b7f6fcfb01aff3626cd1ee347d0e895f6332948b61152cbec6e222"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.208101 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67f887b5-9vsz7" event={"ID":"f507e196-94ca-4c4a-91f4-de3587084d30","Type":"ContainerStarted","Data":"47668775c33692efd81b7bc7a0bfacb000692e6da118d403a58db7fb6562c979"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.434657 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7967bcbb45-bl6c8"] Mar 18 09:22:56 crc kubenswrapper[4778]: W0318 09:22:56.460340 4778 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56a6e416_3b49_4f07_a2ac_7fd1a4f58fe2.slice/crio-0cbfa7c707107fef7fba1585919c6375feaacdc7b31da820c5b759e556860c71 WatchSource:0}: Error finding container 0cbfa7c707107fef7fba1585919c6375feaacdc7b31da820c5b759e556860c71: Status 404 returned error can't find the container with id 0cbfa7c707107fef7fba1585919c6375feaacdc7b31da820c5b759e556860c71 Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.217727 4778 generic.go:334] "Generic (PLEG): container finished" podID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerID="2b8a3d906ce9aeea7f7d488e85e8f58f0a3291f6ef7ed6a10151c8afd47df1d7" exitCode=0 Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.218013 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" event={"ID":"0ccbef26-8b5f-4b83-885f-3c074207eb73","Type":"ContainerDied","Data":"2b8a3d906ce9aeea7f7d488e85e8f58f0a3291f6ef7ed6a10151c8afd47df1d7"} Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.226030 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pttzb" event={"ID":"835f3aad-57da-48e7-ac5a-f0635ee9bc98","Type":"ContainerStarted","Data":"b11383263b8711730419b8679c1b55038dd677778bfced742aebd89591fb64cf"} Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.227936 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7967bcbb45-bl6c8" event={"ID":"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2","Type":"ContainerStarted","Data":"0cbfa7c707107fef7fba1585919c6375feaacdc7b31da820c5b759e556860c71"} Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.231151 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zbghp" event={"ID":"4d42905f-c189-4021-834d-f2a81dae5a4a","Type":"ContainerStarted","Data":"abc212acc9fea22cd31581d6e2bb923603370c1ccdef0851c69537c07eedf089"} Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.252058 
4778 generic.go:334] "Generic (PLEG): container finished" podID="45f70c00-938c-4f67-9c5b-4a88c90b62ae" containerID="31f5aba8b630d38b7c7e2d0b285363c945e3c560fad8314e046d23e6b99ea03e" exitCode=0 Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.252131 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" event={"ID":"45f70c00-938c-4f67-9c5b-4a88c90b62ae","Type":"ContainerDied","Data":"31f5aba8b630d38b7c7e2d0b285363c945e3c560fad8314e046d23e6b99ea03e"} Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.277399 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pttzb" podStartSLOduration=4.277374507 podStartE2EDuration="4.277374507s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:57.265393901 +0000 UTC m=+1243.840138751" watchObservedRunningTime="2026-03-18 09:22:57.277374507 +0000 UTC m=+1243.852119347" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.283157 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-zbghp" podStartSLOduration=4.283144124 podStartE2EDuration="4.283144124s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:57.281551082 +0000 UTC m=+1243.856295932" watchObservedRunningTime="2026-03-18 09:22:57.283144124 +0000 UTC m=+1243.857888964" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.801029 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.826439 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-config\") pod \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.826532 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-dns-svc\") pod \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.827070 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npsqv\" (UniqueName: \"kubernetes.io/projected/45f70c00-938c-4f67-9c5b-4a88c90b62ae-kube-api-access-npsqv\") pod \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.827163 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-nb\") pod \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.851846 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-config" (OuterVolumeSpecName: "config") pod "45f70c00-938c-4f67-9c5b-4a88c90b62ae" (UID: "45f70c00-938c-4f67-9c5b-4a88c90b62ae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.860031 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45f70c00-938c-4f67-9c5b-4a88c90b62ae" (UID: "45f70c00-938c-4f67-9c5b-4a88c90b62ae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.861521 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45f70c00-938c-4f67-9c5b-4a88c90b62ae" (UID: "45f70c00-938c-4f67-9c5b-4a88c90b62ae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.863221 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f70c00-938c-4f67-9c5b-4a88c90b62ae-kube-api-access-npsqv" (OuterVolumeSpecName: "kube-api-access-npsqv") pod "45f70c00-938c-4f67-9c5b-4a88c90b62ae" (UID: "45f70c00-938c-4f67-9c5b-4a88c90b62ae"). InnerVolumeSpecName "kube-api-access-npsqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.929341 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-sb\") pod \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.929818 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.929830 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.929839 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npsqv\" (UniqueName: \"kubernetes.io/projected/45f70c00-938c-4f67-9c5b-4a88c90b62ae-kube-api-access-npsqv\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.929849 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.958482 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45f70c00-938c-4f67-9c5b-4a88c90b62ae" (UID: "45f70c00-938c-4f67-9c5b-4a88c90b62ae"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.033058 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.262155 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" event={"ID":"0ccbef26-8b5f-4b83-885f-3c074207eb73","Type":"ContainerStarted","Data":"b36c483c2b509d6056dce716a1f9836c549822a2372786d1c313cd9da431b608"} Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.263138 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.264737 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" event={"ID":"45f70c00-938c-4f67-9c5b-4a88c90b62ae","Type":"ContainerDied","Data":"2ad9afed41c247bfddfd9b123a9c47b45acee2887be9ed80d8c3249a1334c51d"} Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.264770 4778 scope.go:117] "RemoveContainer" containerID="31f5aba8b630d38b7c7e2d0b285363c945e3c560fad8314e046d23e6b99ea03e" Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.264799 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.285995 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" podStartSLOduration=5.28596118 podStartE2EDuration="5.28596118s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:58.282356022 +0000 UTC m=+1244.857100872" watchObservedRunningTime="2026-03-18 09:22:58.28596118 +0000 UTC m=+1244.860706030" Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.359588 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-m2fhf"] Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.396593 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-m2fhf"] Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.147809 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.148150 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.148211 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.148872 4778 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7154b13c7f4cd2402d0304ed7b86d22cac3e7b544f6222d5ad0d8d8ac0463fe2"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.148924 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://7154b13c7f4cd2402d0304ed7b86d22cac3e7b544f6222d5ad0d8d8ac0463fe2" gracePeriod=600 Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.196557 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f70c00-938c-4f67-9c5b-4a88c90b62ae" path="/var/lib/kubelet/pods/45f70c00-938c-4f67-9c5b-4a88c90b62ae/volumes" Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.290909 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="7154b13c7f4cd2402d0304ed7b86d22cac3e7b544f6222d5ad0d8d8ac0463fe2" exitCode=0 Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.291005 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"7154b13c7f4cd2402d0304ed7b86d22cac3e7b544f6222d5ad0d8d8ac0463fe2"} Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.291072 4778 scope.go:117] "RemoveContainer" containerID="ea61f21841c4e8176f6c189293fbb254c950982318bec3d46909c1b0e41c2f38" Mar 18 09:23:01 crc kubenswrapper[4778]: I0318 09:23:01.305501 4778 generic.go:334] "Generic (PLEG): container finished" podID="835f3aad-57da-48e7-ac5a-f0635ee9bc98" 
containerID="b11383263b8711730419b8679c1b55038dd677778bfced742aebd89591fb64cf" exitCode=0 Mar 18 09:23:01 crc kubenswrapper[4778]: I0318 09:23:01.305943 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pttzb" event={"ID":"835f3aad-57da-48e7-ac5a-f0635ee9bc98","Type":"ContainerDied","Data":"b11383263b8711730419b8679c1b55038dd677778bfced742aebd89591fb64cf"} Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.342032 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67f887b5-9vsz7"] Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.409008 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-99c8bfc86-rldfg"] Mar 18 09:23:02 crc kubenswrapper[4778]: E0318 09:23:02.409430 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f70c00-938c-4f67-9c5b-4a88c90b62ae" containerName="init" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.409448 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f70c00-938c-4f67-9c5b-4a88c90b62ae" containerName="init" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.409622 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f70c00-938c-4f67-9c5b-4a88c90b62ae" containerName="init" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.410559 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.416242 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.418838 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-99c8bfc86-rldfg"] Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.498034 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7967bcbb45-bl6c8"] Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524308 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-config-data\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524350 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1f8a48-d595-4f4e-a740-0af5a26397a5-logs\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524400 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-secret-key\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524434 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-tls-certs\") 
pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524550 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-combined-ca-bundle\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524594 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpxvx\" (UniqueName: \"kubernetes.io/projected/ea1f8a48-d595-4f4e-a740-0af5a26397a5-kube-api-access-wpxvx\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524632 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-scripts\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.538260 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-644f48df4-b7jhq"] Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.539695 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.556781 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-644f48df4-b7jhq"] Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.626638 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-tls-certs\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.627475 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a0a638-c445-4931-861e-d35704487c97-logs\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.627522 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-combined-ca-bundle\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.627540 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxvx\" (UniqueName: \"kubernetes.io/projected/ea1f8a48-d595-4f4e-a740-0af5a26397a5-kube-api-access-wpxvx\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.627557 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e0a0a638-c445-4931-861e-d35704487c97-config-data\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.627577 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-scripts\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-scripts\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628411 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-combined-ca-bundle\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628517 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-horizon-tls-certs\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628620 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-horizon-secret-key\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628758 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-config-data\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628792 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dx2q\" (UniqueName: \"kubernetes.io/projected/e0a0a638-c445-4931-861e-d35704487c97-kube-api-access-5dx2q\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628830 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1f8a48-d595-4f4e-a740-0af5a26397a5-logs\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628915 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0a0a638-c445-4931-861e-d35704487c97-scripts\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.629017 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-secret-key\") pod 
\"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.630030 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1f8a48-d595-4f4e-a740-0af5a26397a5-logs\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.630111 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-config-data\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.649275 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-combined-ca-bundle\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.649377 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-secret-key\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.649476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-tls-certs\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc 
kubenswrapper[4778]: I0318 09:23:02.652452 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpxvx\" (UniqueName: \"kubernetes.io/projected/ea1f8a48-d595-4f4e-a740-0af5a26397a5-kube-api-access-wpxvx\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-horizon-tls-certs\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731120 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-horizon-secret-key\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731223 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dx2q\" (UniqueName: \"kubernetes.io/projected/e0a0a638-c445-4931-861e-d35704487c97-kube-api-access-5dx2q\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731268 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0a0a638-c445-4931-861e-d35704487c97-scripts\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731344 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a0a638-c445-4931-861e-d35704487c97-logs\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731398 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0a0a638-c445-4931-861e-d35704487c97-config-data\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731433 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-combined-ca-bundle\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731768 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a0a638-c445-4931-861e-d35704487c97-logs\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.733181 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0a0a638-c445-4931-861e-d35704487c97-scripts\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.734074 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0a0a638-c445-4931-861e-d35704487c97-config-data\") pod 
\"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.736664 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-combined-ca-bundle\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.737989 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-horizon-secret-key\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.739511 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-horizon-tls-certs\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.755026 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dx2q\" (UniqueName: \"kubernetes.io/projected/e0a0a638-c445-4931-861e-d35704487c97-kube-api-access-5dx2q\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.782160 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.858425 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:04 crc kubenswrapper[4778]: I0318 09:23:04.161397 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:23:04 crc kubenswrapper[4778]: I0318 09:23:04.248026 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-npjfc"] Mar 18 09:23:04 crc kubenswrapper[4778]: I0318 09:23:04.252168 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" containerID="cri-o://28ca12d3a58a03f2fd89d9677d86730c400a702b9f62ca98b238e2b7a92046fd" gracePeriod=10 Mar 18 09:23:05 crc kubenswrapper[4778]: I0318 09:23:05.352386 4778 generic.go:334] "Generic (PLEG): container finished" podID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerID="28ca12d3a58a03f2fd89d9677d86730c400a702b9f62ca98b238e2b7a92046fd" exitCode=0 Mar 18 09:23:05 crc kubenswrapper[4778]: I0318 09:23:05.352458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" event={"ID":"4a490b75-6853-41f7-b5b3-46243c4c2166","Type":"ContainerDied","Data":"28ca12d3a58a03f2fd89d9677d86730c400a702b9f62ca98b238e2b7a92046fd"} Mar 18 09:23:08 crc kubenswrapper[4778]: I0318 09:23:08.561582 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.080864 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.170004 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-combined-ca-bundle\") pod \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.171264 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9lph\" (UniqueName: \"kubernetes.io/projected/835f3aad-57da-48e7-ac5a-f0635ee9bc98-kube-api-access-x9lph\") pod \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.171423 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-fernet-keys\") pod \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.171487 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-scripts\") pod \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.171575 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-config-data\") pod \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.171722 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-credential-keys\") pod \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.181753 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-scripts" (OuterVolumeSpecName: "scripts") pod "835f3aad-57da-48e7-ac5a-f0635ee9bc98" (UID: "835f3aad-57da-48e7-ac5a-f0635ee9bc98"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.182050 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/835f3aad-57da-48e7-ac5a-f0635ee9bc98-kube-api-access-x9lph" (OuterVolumeSpecName: "kube-api-access-x9lph") pod "835f3aad-57da-48e7-ac5a-f0635ee9bc98" (UID: "835f3aad-57da-48e7-ac5a-f0635ee9bc98"). InnerVolumeSpecName "kube-api-access-x9lph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.182131 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "835f3aad-57da-48e7-ac5a-f0635ee9bc98" (UID: "835f3aad-57da-48e7-ac5a-f0635ee9bc98"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.186686 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "835f3aad-57da-48e7-ac5a-f0635ee9bc98" (UID: "835f3aad-57da-48e7-ac5a-f0635ee9bc98"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.199750 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "835f3aad-57da-48e7-ac5a-f0635ee9bc98" (UID: "835f3aad-57da-48e7-ac5a-f0635ee9bc98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.205775 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-config-data" (OuterVolumeSpecName: "config-data") pod "835f3aad-57da-48e7-ac5a-f0635ee9bc98" (UID: "835f3aad-57da-48e7-ac5a-f0635ee9bc98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.274042 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.274085 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.274096 4778 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.274109 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:09 crc 
kubenswrapper[4778]: I0318 09:23:09.274119 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9lph\" (UniqueName: \"kubernetes.io/projected/835f3aad-57da-48e7-ac5a-f0635ee9bc98-kube-api-access-x9lph\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.274131 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.923242 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pttzb" event={"ID":"835f3aad-57da-48e7-ac5a-f0635ee9bc98","Type":"ContainerDied","Data":"5f8d8cdbe6b7f6fcfb01aff3626cd1ee347d0e895f6332948b61152cbec6e222"} Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.923292 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f8d8cdbe6b7f6fcfb01aff3626cd1ee347d0e895f6332948b61152cbec6e222" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.923376 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.162919 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pttzb"] Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.174486 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pttzb"] Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.203959 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="835f3aad-57da-48e7-ac5a-f0635ee9bc98" path="/var/lib/kubelet/pods/835f3aad-57da-48e7-ac5a-f0635ee9bc98/volumes" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.266934 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5drhw"] Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.267442 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="835f3aad-57da-48e7-ac5a-f0635ee9bc98" containerName="keystone-bootstrap" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.267469 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="835f3aad-57da-48e7-ac5a-f0635ee9bc98" containerName="keystone-bootstrap" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.267700 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="835f3aad-57da-48e7-ac5a-f0635ee9bc98" containerName="keystone-bootstrap" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.268286 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.272531 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.272852 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.273151 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.273396 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.273516 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vcgh4" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.281536 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5drhw"] Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.300303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-scripts\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.300368 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-config-data\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.300388 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-combined-ca-bundle\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.300475 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfjxd\" (UniqueName: \"kubernetes.io/projected/0fb26926-fc81-4024-a0fa-2363d8703d72-kube-api-access-cfjxd\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.300513 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-fernet-keys\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.300547 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-credential-keys\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.404080 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-config-data\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.404125 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-combined-ca-bundle\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.404172 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfjxd\" (UniqueName: \"kubernetes.io/projected/0fb26926-fc81-4024-a0fa-2363d8703d72-kube-api-access-cfjxd\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.404214 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-fernet-keys\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.404242 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-credential-keys\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.404311 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-scripts\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.410743 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-config-data\") pod \"keystone-bootstrap-5drhw\" (UID: 
\"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.411634 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-scripts\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.414356 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-combined-ca-bundle\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.418636 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-credential-keys\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.420936 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-fernet-keys\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.425369 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfjxd\" (UniqueName: \"kubernetes.io/projected/0fb26926-fc81-4024-a0fa-2363d8703d72-kube-api-access-cfjxd\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 
09:23:10.594612 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.960185 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.960597 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s7f7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},
},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-2p9jg_openstack(20eafe8e-c0b9-4463-bc12-8c0cd0359968): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.963445 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-2p9jg" podUID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.970550 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.970749 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n685h65h586h67fh5d8h674h65bh54h5c7h7dh9fh659h7ch658h5b7h8bh684h66fh5dh74h68ch645h95h65dh5d6h67ch67h95h5f4h5c5h56fh56cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-psmpb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-568467c8dc-v4vlb_openstack(14a62749-8336-4894-a162-85350096aef4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 
09:23:10.973051 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-568467c8dc-v4vlb" podUID="14a62749-8336-4894-a162-85350096aef4" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.992919 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.993129 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n9dh669h688h8h5ddh697hc4h594h644h598h5b7h574h545h89h7chfdh87hf4h579hd5h5b9h688h597h5f4h554h64dh5d9h66bhf6h7fh579h587q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gzl9p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-67f887b5-9vsz7_openstack(f507e196-94ca-4c4a-91f4-de3587084d30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 
09:23:10.996264 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-67f887b5-9vsz7" podUID="f507e196-94ca-4c4a-91f4-de3587084d30" Mar 18 09:23:11 crc kubenswrapper[4778]: E0318 09:23:11.944397 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-2p9jg" podUID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" Mar 18 09:23:13 crc kubenswrapper[4778]: I0318 09:23:13.562154 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Mar 18 09:23:13 crc kubenswrapper[4778]: E0318 09:23:13.737362 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 18 09:23:13 crc kubenswrapper[4778]: E0318 09:23:13.737553 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7gnxr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-98prp_openstack(4135fc20-df28-4f8d-b244-aedd5ed57cc2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:23:13 crc kubenswrapper[4778]: E0318 09:23:13.738735 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-98prp" 
podUID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" Mar 18 09:23:13 crc kubenswrapper[4778]: E0318 09:23:13.964354 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-98prp" podUID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" Mar 18 09:23:15 crc kubenswrapper[4778]: I0318 09:23:15.981803 4778 generic.go:334] "Generic (PLEG): container finished" podID="4d42905f-c189-4021-834d-f2a81dae5a4a" containerID="abc212acc9fea22cd31581d6e2bb923603370c1ccdef0851c69537c07eedf089" exitCode=0 Mar 18 09:23:15 crc kubenswrapper[4778]: I0318 09:23:15.981976 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zbghp" event={"ID":"4d42905f-c189-4021-834d-f2a81dae5a4a","Type":"ContainerDied","Data":"abc212acc9fea22cd31581d6e2bb923603370c1ccdef0851c69537c07eedf089"} Mar 18 09:23:18 crc kubenswrapper[4778]: I0318 09:23:18.562298 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Mar 18 09:23:18 crc kubenswrapper[4778]: I0318 09:23:18.563015 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.427470 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.438802 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.448616 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-zbghp" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.488686 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-scripts\") pod \"14a62749-8336-4894-a162-85350096aef4\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.488798 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a62749-8336-4894-a162-85350096aef4-logs\") pod \"14a62749-8336-4894-a162-85350096aef4\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.488825 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14a62749-8336-4894-a162-85350096aef4-horizon-secret-key\") pod \"14a62749-8336-4894-a162-85350096aef4\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.488872 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psmpb\" (UniqueName: \"kubernetes.io/projected/14a62749-8336-4894-a162-85350096aef4-kube-api-access-psmpb\") pod \"14a62749-8336-4894-a162-85350096aef4\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.488896 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-config-data\") pod \"14a62749-8336-4894-a162-85350096aef4\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.490093 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-config-data" (OuterVolumeSpecName: "config-data") pod "14a62749-8336-4894-a162-85350096aef4" (UID: "14a62749-8336-4894-a162-85350096aef4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.490243 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-scripts" (OuterVolumeSpecName: "scripts") pod "14a62749-8336-4894-a162-85350096aef4" (UID: "14a62749-8336-4894-a162-85350096aef4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.490890 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a62749-8336-4894-a162-85350096aef4-logs" (OuterVolumeSpecName: "logs") pod "14a62749-8336-4894-a162-85350096aef4" (UID: "14a62749-8336-4894-a162-85350096aef4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.496962 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a62749-8336-4894-a162-85350096aef4-kube-api-access-psmpb" (OuterVolumeSpecName: "kube-api-access-psmpb") pod "14a62749-8336-4894-a162-85350096aef4" (UID: "14a62749-8336-4894-a162-85350096aef4"). InnerVolumeSpecName "kube-api-access-psmpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.497930 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a62749-8336-4894-a162-85350096aef4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "14a62749-8336-4894-a162-85350096aef4" (UID: "14a62749-8336-4894-a162-85350096aef4"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.590762 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f507e196-94ca-4c4a-91f4-de3587084d30-horizon-secret-key\") pod \"f507e196-94ca-4c4a-91f4-de3587084d30\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591003 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-config\") pod \"4d42905f-c189-4021-834d-f2a81dae5a4a\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591038 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-scripts\") pod \"f507e196-94ca-4c4a-91f4-de3587084d30\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591094 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzl9p\" (UniqueName: \"kubernetes.io/projected/f507e196-94ca-4c4a-91f4-de3587084d30-kube-api-access-gzl9p\") pod \"f507e196-94ca-4c4a-91f4-de3587084d30\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591272 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-combined-ca-bundle\") pod \"4d42905f-c189-4021-834d-f2a81dae5a4a\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591304 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcdfb\" (UniqueName: 
\"kubernetes.io/projected/4d42905f-c189-4021-834d-f2a81dae5a4a-kube-api-access-lcdfb\") pod \"4d42905f-c189-4021-834d-f2a81dae5a4a\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591328 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-config-data\") pod \"f507e196-94ca-4c4a-91f4-de3587084d30\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591398 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f507e196-94ca-4c4a-91f4-de3587084d30-logs\") pod \"f507e196-94ca-4c4a-91f4-de3587084d30\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591920 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591946 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a62749-8336-4894-a162-85350096aef4-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591957 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14a62749-8336-4894-a162-85350096aef4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591972 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psmpb\" (UniqueName: \"kubernetes.io/projected/14a62749-8336-4894-a162-85350096aef4-kube-api-access-psmpb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591982 4778 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.593015 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f507e196-94ca-4c4a-91f4-de3587084d30-logs" (OuterVolumeSpecName: "logs") pod "f507e196-94ca-4c4a-91f4-de3587084d30" (UID: "f507e196-94ca-4c4a-91f4-de3587084d30"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.593415 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-scripts" (OuterVolumeSpecName: "scripts") pod "f507e196-94ca-4c4a-91f4-de3587084d30" (UID: "f507e196-94ca-4c4a-91f4-de3587084d30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.597295 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f507e196-94ca-4c4a-91f4-de3587084d30-kube-api-access-gzl9p" (OuterVolumeSpecName: "kube-api-access-gzl9p") pod "f507e196-94ca-4c4a-91f4-de3587084d30" (UID: "f507e196-94ca-4c4a-91f4-de3587084d30"). InnerVolumeSpecName "kube-api-access-gzl9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.597642 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-config-data" (OuterVolumeSpecName: "config-data") pod "f507e196-94ca-4c4a-91f4-de3587084d30" (UID: "f507e196-94ca-4c4a-91f4-de3587084d30"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.597877 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d42905f-c189-4021-834d-f2a81dae5a4a-kube-api-access-lcdfb" (OuterVolumeSpecName: "kube-api-access-lcdfb") pod "4d42905f-c189-4021-834d-f2a81dae5a4a" (UID: "4d42905f-c189-4021-834d-f2a81dae5a4a"). InnerVolumeSpecName "kube-api-access-lcdfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.599448 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f507e196-94ca-4c4a-91f4-de3587084d30-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f507e196-94ca-4c4a-91f4-de3587084d30" (UID: "f507e196-94ca-4c4a-91f4-de3587084d30"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.619142 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d42905f-c189-4021-834d-f2a81dae5a4a" (UID: "4d42905f-c189-4021-834d-f2a81dae5a4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.620275 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-config" (OuterVolumeSpecName: "config") pod "4d42905f-c189-4021-834d-f2a81dae5a4a" (UID: "4d42905f-c189-4021-834d-f2a81dae5a4a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694243 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694310 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694324 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzl9p\" (UniqueName: \"kubernetes.io/projected/f507e196-94ca-4c4a-91f4-de3587084d30-kube-api-access-gzl9p\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694339 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694350 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcdfb\" (UniqueName: \"kubernetes.io/projected/4d42905f-c189-4021-834d-f2a81dae5a4a-kube-api-access-lcdfb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694358 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694366 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f507e196-94ca-4c4a-91f4-de3587084d30-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694377 4778 reconciler_common.go:293] 
"Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f507e196-94ca-4c4a-91f4-de3587084d30-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.054931 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-568467c8dc-v4vlb" event={"ID":"14a62749-8336-4894-a162-85350096aef4","Type":"ContainerDied","Data":"72adaf6b6c1593294d8954a516b806ad25d5b5421c025d726c18c87806d6bbc1"} Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.054959 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.060721 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.060734 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67f887b5-9vsz7" event={"ID":"f507e196-94ca-4c4a-91f4-de3587084d30","Type":"ContainerDied","Data":"47668775c33692efd81b7bc7a0bfacb000692e6da118d403a58db7fb6562c979"} Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.070605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zbghp" event={"ID":"4d42905f-c189-4021-834d-f2a81dae5a4a","Type":"ContainerDied","Data":"0ae8af7be7deced401f81be079f01e4b10cb21029c529a4299ba7b28d3003f40"} Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.070647 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ae8af7be7deced401f81be079f01e4b10cb21029c529a4299ba7b28d3003f40" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.070691 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-zbghp" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.157873 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-568467c8dc-v4vlb"] Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.167065 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-568467c8dc-v4vlb"] Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.187513 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67f887b5-9vsz7"] Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.194569 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67f887b5-9vsz7"] Mar 18 09:23:23 crc kubenswrapper[4778]: E0318 09:23:23.625525 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 18 09:23:23 crc kubenswrapper[4778]: E0318 09:23:23.626141 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c9z5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-jb4ss_openstack(fba399d9-71ac-41c3-912f-32ccc7fc6190): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:23:23 crc kubenswrapper[4778]: E0318 09:23:23.627355 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-jb4ss" podUID="fba399d9-71ac-41c3-912f-32ccc7fc6190" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.789744 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-p2knr"] Mar 18 09:23:23 crc kubenswrapper[4778]: E0318 09:23:23.790692 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d42905f-c189-4021-834d-f2a81dae5a4a" containerName="neutron-db-sync" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.790709 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d42905f-c189-4021-834d-f2a81dae5a4a" containerName="neutron-db-sync" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.790915 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d42905f-c189-4021-834d-f2a81dae5a4a" containerName="neutron-db-sync" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.791851 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.815438 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.867701 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-p2knr"] Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.910288 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cffc84f44-vtx7x"] Mar 18 09:23:23 crc kubenswrapper[4778]: E0318 09:23:23.910762 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.910776 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" Mar 18 09:23:23 crc kubenswrapper[4778]: E0318 09:23:23.910786 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="init" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.910792 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="init" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.911004 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.911901 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.915260 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.915787 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.916035 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tvhb7" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.920740 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.927267 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cffc84f44-vtx7x"] Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.928316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-sb\") pod \"4a490b75-6853-41f7-b5b3-46243c4c2166\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.928385 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-config\") pod \"4a490b75-6853-41f7-b5b3-46243c4c2166\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.928506 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-dns-svc\") pod \"4a490b75-6853-41f7-b5b3-46243c4c2166\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.928564 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-nb\") pod \"4a490b75-6853-41f7-b5b3-46243c4c2166\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.928717 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68vb9\" (UniqueName: \"kubernetes.io/projected/4a490b75-6853-41f7-b5b3-46243c4c2166-kube-api-access-68vb9\") pod \"4a490b75-6853-41f7-b5b3-46243c4c2166\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.932864 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.932918 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.933016 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.933041 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-config\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.933081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mdv2\" (UniqueName: \"kubernetes.io/projected/ae2c97b2-c699-443a-b3b3-ecb22de258c2-kube-api-access-8mdv2\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.937424 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a490b75-6853-41f7-b5b3-46243c4c2166-kube-api-access-68vb9" (OuterVolumeSpecName: "kube-api-access-68vb9") pod "4a490b75-6853-41f7-b5b3-46243c4c2166" (UID: "4a490b75-6853-41f7-b5b3-46243c4c2166"). InnerVolumeSpecName "kube-api-access-68vb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034271 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-ovndb-tls-certs\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034328 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034354 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-config\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034373 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-httpd-config\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034402 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mdv2\" (UniqueName: \"kubernetes.io/projected/ae2c97b2-c699-443a-b3b3-ecb22de258c2-kube-api-access-8mdv2\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 
18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034457 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034491 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034510 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-combined-ca-bundle\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034544 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-config\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034593 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnk27\" (UniqueName: \"kubernetes.io/projected/3d1d399f-3c89-4aaa-bba1-ce1a91358455-kube-api-access-qnk27\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc 
kubenswrapper[4778]: I0318 09:23:24.034632 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68vb9\" (UniqueName: \"kubernetes.io/projected/4a490b75-6853-41f7-b5b3-46243c4c2166-kube-api-access-68vb9\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.036112 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.037028 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-config\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.037062 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.037082 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.062067 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mdv2\" (UniqueName: 
\"kubernetes.io/projected/ae2c97b2-c699-443a-b3b3-ecb22de258c2-kube-api-access-8mdv2\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.082307 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7967bcbb45-bl6c8" event={"ID":"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2","Type":"ContainerStarted","Data":"6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a"} Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.084786 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.085449 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" event={"ID":"4a490b75-6853-41f7-b5b3-46243c4c2166","Type":"ContainerDied","Data":"5ee3b13ece4d9acc78176c498a90576572d6699436ce526238a6b4027ba90016"} Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.085503 4778 scope.go:117] "RemoveContainer" containerID="28ca12d3a58a03f2fd89d9677d86730c400a702b9f62ca98b238e2b7a92046fd" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.091545 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"84af870879787d3b66c88494739078983637b86ff0a5bba3998c98c4487f8853"} Mar 18 09:23:24 crc kubenswrapper[4778]: E0318 09:23:24.097572 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-jb4ss" podUID="fba399d9-71ac-41c3-912f-32ccc7fc6190" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 
09:23:24.110831 4778 scope.go:117] "RemoveContainer" containerID="434cdf2be80022d070ec54085d25f3459b4f9eebfed357f2ab22cbeac663278b" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.136993 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-combined-ca-bundle\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.137058 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-config\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.137120 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnk27\" (UniqueName: \"kubernetes.io/projected/3d1d399f-3c89-4aaa-bba1-ce1a91358455-kube-api-access-qnk27\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.137142 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-ovndb-tls-certs\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.137173 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-httpd-config\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " 
pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.147827 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-ovndb-tls-certs\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.153596 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-combined-ca-bundle\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.156344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-httpd-config\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.176539 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5drhw"] Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.179148 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnk27\" (UniqueName: \"kubernetes.io/projected/3d1d399f-3c89-4aaa-bba1-ce1a91358455-kube-api-access-qnk27\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.179515 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-config\") pod \"neutron-cffc84f44-vtx7x\" (UID: 
\"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.190940 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.206658 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a490b75-6853-41f7-b5b3-46243c4c2166" (UID: "4a490b75-6853-41f7-b5b3-46243c4c2166"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.207171 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a490b75-6853-41f7-b5b3-46243c4c2166" (UID: "4a490b75-6853-41f7-b5b3-46243c4c2166"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.210377 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a490b75-6853-41f7-b5b3-46243c4c2166" (UID: "4a490b75-6853-41f7-b5b3-46243c4c2166"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.217136 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.219139 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-config" (OuterVolumeSpecName: "config") pod "4a490b75-6853-41f7-b5b3-46243c4c2166" (UID: "4a490b75-6853-41f7-b5b3-46243c4c2166"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.227655 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a62749-8336-4894-a162-85350096aef4" path="/var/lib/kubelet/pods/14a62749-8336-4894-a162-85350096aef4/volumes" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.228323 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f507e196-94ca-4c4a-91f4-de3587084d30" path="/var/lib/kubelet/pods/f507e196-94ca-4c4a-91f4-de3587084d30/volumes" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.238251 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.239605 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.239636 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.239651 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.239663 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.267393 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-644f48df4-b7jhq"] Mar 18 09:23:24 crc kubenswrapper[4778]: W0318 09:23:24.288129 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a0a638_c445_4931_861e_d35704487c97.slice/crio-0fbce59dc7e1c7d23f86e5641e55e03088590e21a72f00c1960f4c85cfb15726 WatchSource:0}: Error finding container 0fbce59dc7e1c7d23f86e5641e55e03088590e21a72f00c1960f4c85cfb15726: Status 404 returned error can't find the container with id 0fbce59dc7e1c7d23f86e5641e55e03088590e21a72f00c1960f4c85cfb15726 Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.385438 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-99c8bfc86-rldfg"] Mar 18 09:23:24 crc 
kubenswrapper[4778]: I0318 09:23:24.505076 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-npjfc"] Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.512078 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-npjfc"] Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.798602 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-p2knr"] Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.124866 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" event={"ID":"ae2c97b2-c699-443a-b3b3-ecb22de258c2","Type":"ContainerStarted","Data":"548920e3510d285dcfc8978dccbf1745880c114136f2e4109ae6de9a628d0437"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.146268 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5drhw" event={"ID":"0fb26926-fc81-4024-a0fa-2363d8703d72","Type":"ContainerStarted","Data":"a9c5d9789a0793c932c41c72d2058c14c6dc506dbd6a4bb8ed76c0353ce8bcc2"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.146337 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5drhw" event={"ID":"0fb26926-fc81-4024-a0fa-2363d8703d72","Type":"ContainerStarted","Data":"2609a43b731af1cf4cb19221492cd2a7688e79a1ecb7ef9c42072e9a7bc39519"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.154378 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7967bcbb45-bl6c8" event={"ID":"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2","Type":"ContainerStarted","Data":"42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.154534 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7967bcbb45-bl6c8" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon-log" 
containerID="cri-o://6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a" gracePeriod=30 Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.154782 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7967bcbb45-bl6c8" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon" containerID="cri-o://42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec" gracePeriod=30 Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.173023 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5drhw" podStartSLOduration=15.173000172 podStartE2EDuration="15.173000172s" podCreationTimestamp="2026-03-18 09:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:25.167514612 +0000 UTC m=+1271.742259452" watchObservedRunningTime="2026-03-18 09:23:25.173000172 +0000 UTC m=+1271.747745012" Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.182774 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-99c8bfc86-rldfg" event={"ID":"ea1f8a48-d595-4f4e-a740-0af5a26397a5","Type":"ContainerStarted","Data":"d66d99949ad5120620bffa75ea4a3e49aee89967a841972aa06a4a9867cff673"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.182822 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-99c8bfc86-rldfg" event={"ID":"ea1f8a48-d595-4f4e-a740-0af5a26397a5","Type":"ContainerStarted","Data":"c77fd4278a90c239273c01a79ef12824477ebf4a1fc89be85a96364b2e982560"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.186450 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-644f48df4-b7jhq" event={"ID":"e0a0a638-c445-4931-861e-d35704487c97","Type":"ContainerStarted","Data":"602f232dbc5842b7eda582b514327d9043c8239c9d2c5f6be5f9b82563f5c319"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 
09:23:25.186487 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-644f48df4-b7jhq" event={"ID":"e0a0a638-c445-4931-861e-d35704487c97","Type":"ContainerStarted","Data":"0fbce59dc7e1c7d23f86e5641e55e03088590e21a72f00c1960f4c85cfb15726"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.188340 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerStarted","Data":"7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.198434 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7967bcbb45-bl6c8" podStartSLOduration=3.13577453 podStartE2EDuration="30.198418414s" podCreationTimestamp="2026-03-18 09:22:55 +0000 UTC" firstStartedPulling="2026-03-18 09:22:56.480425769 +0000 UTC m=+1243.055170609" lastFinishedPulling="2026-03-18 09:23:23.543069613 +0000 UTC m=+1270.117814493" observedRunningTime="2026-03-18 09:23:25.197491689 +0000 UTC m=+1271.772236539" watchObservedRunningTime="2026-03-18 09:23:25.198418414 +0000 UTC m=+1271.773163254" Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.472167 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cffc84f44-vtx7x"] Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.784453 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.208784 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" path="/var/lib/kubelet/pods/4a490b75-6853-41f7-b5b3-46243c4c2166/volumes" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.217569 4778 generic.go:334] "Generic (PLEG): container finished" podID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" 
containerID="a2a49acb877ac3d5291587410f4bdad35a27b8e7dc386fa78d21020a20cbe78c" exitCode=0 Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.217648 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" event={"ID":"ae2c97b2-c699-443a-b3b3-ecb22de258c2","Type":"ContainerDied","Data":"a2a49acb877ac3d5291587410f4bdad35a27b8e7dc386fa78d21020a20cbe78c"} Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.230600 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cffc84f44-vtx7x" event={"ID":"3d1d399f-3c89-4aaa-bba1-ce1a91358455","Type":"ContainerStarted","Data":"e444d25da091179b3622d7408ec0b6e7caa7c81b27414dca1d4252c8b3fb5441"} Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.234214 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-99c8bfc86-rldfg" event={"ID":"ea1f8a48-d595-4f4e-a740-0af5a26397a5","Type":"ContainerStarted","Data":"c97b0574c01538fa1c2084f0fe2463a0e870b56db44f51730acba68dbd0ebf25"} Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.245162 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-644f48df4-b7jhq" event={"ID":"e0a0a638-c445-4931-861e-d35704487c97","Type":"ContainerStarted","Data":"338c9caa4dcb4c9e2071a164b7da01c5e6507793b93b5cb80dcaa9454afdb9e2"} Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.289725 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-644f48df4-b7jhq" podStartSLOduration=24.28970075 podStartE2EDuration="24.28970075s" podCreationTimestamp="2026-03-18 09:23:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:26.27761755 +0000 UTC m=+1272.852362420" watchObservedRunningTime="2026-03-18 09:23:26.28970075 +0000 UTC m=+1272.864445610" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.317378 4778 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/horizon-99c8bfc86-rldfg" podStartSLOduration=24.317355063 podStartE2EDuration="24.317355063s" podCreationTimestamp="2026-03-18 09:23:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:26.30181366 +0000 UTC m=+1272.876558520" watchObservedRunningTime="2026-03-18 09:23:26.317355063 +0000 UTC m=+1272.892099893" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.412332 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56b9647d87-2qhmh"] Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.413631 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.423496 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.423911 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.437609 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56b9647d87-2qhmh"] Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.519726 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-combined-ca-bundle\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.520115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-public-tls-certs\") pod 
\"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.520144 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-ovndb-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.520170 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-config\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.520203 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb97b\" (UniqueName: \"kubernetes.io/projected/2b12c496-12d8-47e5-8cb7-134c3860368d-kube-api-access-qb97b\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.520259 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-httpd-config\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.520427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-internal-tls-certs\") pod 
\"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622531 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-combined-ca-bundle\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622595 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-public-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622633 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-ovndb-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622667 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-config\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622695 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb97b\" (UniqueName: \"kubernetes.io/projected/2b12c496-12d8-47e5-8cb7-134c3860368d-kube-api-access-qb97b\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " 
pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622964 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-httpd-config\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622993 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-internal-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.629850 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-public-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.630364 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-httpd-config\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.632007 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-ovndb-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.632528 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-internal-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.635000 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-combined-ca-bundle\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.638963 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-config\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.646871 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb97b\" (UniqueName: \"kubernetes.io/projected/2b12c496-12d8-47e5-8cb7-134c3860368d-kube-api-access-qb97b\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.887859 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.260727 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2p9jg" event={"ID":"20eafe8e-c0b9-4463-bc12-8c0cd0359968","Type":"ContainerStarted","Data":"43cd9dd19c1bcb6dd251052b9f5e2f4fd14b11fc891320649bdbfdb44d9ca171"} Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.269241 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cffc84f44-vtx7x" event={"ID":"3d1d399f-3c89-4aaa-bba1-ce1a91358455","Type":"ContainerStarted","Data":"6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3"} Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.269283 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cffc84f44-vtx7x" event={"ID":"3d1d399f-3c89-4aaa-bba1-ce1a91358455","Type":"ContainerStarted","Data":"5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827"} Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.269691 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.273674 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerStarted","Data":"d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053"} Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.283083 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" event={"ID":"ae2c97b2-c699-443a-b3b3-ecb22de258c2","Type":"ContainerStarted","Data":"14bdd0eaf1dbaf2dd412641a551329bb6a350b27bbb5a0fca3b8cd2009883565"} Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.316354 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2p9jg" podStartSLOduration=3.47383832 
podStartE2EDuration="34.316324664s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="2026-03-18 09:22:55.93637278 +0000 UTC m=+1242.511117620" lastFinishedPulling="2026-03-18 09:23:26.778859124 +0000 UTC m=+1273.353603964" observedRunningTime="2026-03-18 09:23:27.282121272 +0000 UTC m=+1273.856866112" watchObservedRunningTime="2026-03-18 09:23:27.316324664 +0000 UTC m=+1273.891069504" Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.317178 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cffc84f44-vtx7x" podStartSLOduration=4.317170947 podStartE2EDuration="4.317170947s" podCreationTimestamp="2026-03-18 09:23:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:27.301729206 +0000 UTC m=+1273.876474056" watchObservedRunningTime="2026-03-18 09:23:27.317170947 +0000 UTC m=+1273.891915787" Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.332041 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" podStartSLOduration=4.332020732 podStartE2EDuration="4.332020732s" podCreationTimestamp="2026-03-18 09:23:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:27.32793358 +0000 UTC m=+1273.902678430" watchObservedRunningTime="2026-03-18 09:23:27.332020732 +0000 UTC m=+1273.906765572" Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.761119 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56b9647d87-2qhmh"] Mar 18 09:23:27 crc kubenswrapper[4778]: W0318 09:23:27.791883 4778 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b12c496_12d8_47e5_8cb7_134c3860368d.slice/crio-e8df8ad489699b62f3719756fb074fcb6bc53de4a98faae3059fd18d69bd24b5 WatchSource:0}: Error finding container e8df8ad489699b62f3719756fb074fcb6bc53de4a98faae3059fd18d69bd24b5: Status 404 returned error can't find the container with id e8df8ad489699b62f3719756fb074fcb6bc53de4a98faae3059fd18d69bd24b5 Mar 18 09:23:28 crc kubenswrapper[4778]: I0318 09:23:28.334182 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b9647d87-2qhmh" event={"ID":"2b12c496-12d8-47e5-8cb7-134c3860368d","Type":"ContainerStarted","Data":"1b28dfe7d2202acd51cd4556f1c17c3749cac234f100a94b65f9772c04f97b62"} Mar 18 09:23:28 crc kubenswrapper[4778]: I0318 09:23:28.334728 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:28 crc kubenswrapper[4778]: I0318 09:23:28.334743 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b9647d87-2qhmh" event={"ID":"2b12c496-12d8-47e5-8cb7-134c3860368d","Type":"ContainerStarted","Data":"e8df8ad489699b62f3719756fb074fcb6bc53de4a98faae3059fd18d69bd24b5"} Mar 18 09:23:28 crc kubenswrapper[4778]: I0318 09:23:28.566498 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Mar 18 09:23:29 crc kubenswrapper[4778]: I0318 09:23:29.346254 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b9647d87-2qhmh" event={"ID":"2b12c496-12d8-47e5-8cb7-134c3860368d","Type":"ContainerStarted","Data":"97b1e09d6053130fb7ed8cd3a7a1b57d88faf0c15aceaa656cf6b72bf7bed517"} Mar 18 09:23:29 crc kubenswrapper[4778]: I0318 09:23:29.346962 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:29 crc kubenswrapper[4778]: I0318 09:23:29.378011 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56b9647d87-2qhmh" podStartSLOduration=3.377991952 podStartE2EDuration="3.377991952s" podCreationTimestamp="2026-03-18 09:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:29.370771096 +0000 UTC m=+1275.945515936" watchObservedRunningTime="2026-03-18 09:23:29.377991952 +0000 UTC m=+1275.952736792" Mar 18 09:23:30 crc kubenswrapper[4778]: I0318 09:23:30.358304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-98prp" event={"ID":"4135fc20-df28-4f8d-b244-aedd5ed57cc2","Type":"ContainerStarted","Data":"f3f45a8c98da9f26aefaed6f12fcf1ccfeb1c540f04357427bf8f13c5c12ad79"} Mar 18 09:23:30 crc kubenswrapper[4778]: I0318 09:23:30.373720 4778 generic.go:334] "Generic (PLEG): container finished" podID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" containerID="43cd9dd19c1bcb6dd251052b9f5e2f4fd14b11fc891320649bdbfdb44d9ca171" exitCode=0 Mar 18 09:23:30 crc kubenswrapper[4778]: I0318 09:23:30.373804 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2p9jg" event={"ID":"20eafe8e-c0b9-4463-bc12-8c0cd0359968","Type":"ContainerDied","Data":"43cd9dd19c1bcb6dd251052b9f5e2f4fd14b11fc891320649bdbfdb44d9ca171"} Mar 18 09:23:30 crc kubenswrapper[4778]: I0318 09:23:30.388262 4778 generic.go:334] "Generic (PLEG): container finished" podID="0fb26926-fc81-4024-a0fa-2363d8703d72" containerID="a9c5d9789a0793c932c41c72d2058c14c6dc506dbd6a4bb8ed76c0353ce8bcc2" exitCode=0 Mar 18 09:23:30 crc kubenswrapper[4778]: I0318 09:23:30.389038 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5drhw" 
event={"ID":"0fb26926-fc81-4024-a0fa-2363d8703d72","Type":"ContainerDied","Data":"a9c5d9789a0793c932c41c72d2058c14c6dc506dbd6a4bb8ed76c0353ce8bcc2"} Mar 18 09:23:30 crc kubenswrapper[4778]: I0318 09:23:30.389697 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-98prp" podStartSLOduration=3.631364581 podStartE2EDuration="37.3896886s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="2026-03-18 09:22:55.961414632 +0000 UTC m=+1242.536159472" lastFinishedPulling="2026-03-18 09:23:29.719738651 +0000 UTC m=+1276.294483491" observedRunningTime="2026-03-18 09:23:30.388655382 +0000 UTC m=+1276.963400242" watchObservedRunningTime="2026-03-18 09:23:30.3896886 +0000 UTC m=+1276.964433440" Mar 18 09:23:32 crc kubenswrapper[4778]: I0318 09:23:32.783286 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:32 crc kubenswrapper[4778]: I0318 09:23:32.784363 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:32 crc kubenswrapper[4778]: I0318 09:23:32.860066 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:32 crc kubenswrapper[4778]: I0318 09:23:32.860137 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:34 crc kubenswrapper[4778]: I0318 09:23:34.205010 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:34 crc kubenswrapper[4778]: I0318 09:23:34.317921 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-nz24n"] Mar 18 09:23:34 crc kubenswrapper[4778]: I0318 09:23:34.318321 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" 
podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerName="dnsmasq-dns" containerID="cri-o://b36c483c2b509d6056dce716a1f9836c549822a2372786d1c313cd9da431b608" gracePeriod=10 Mar 18 09:23:34 crc kubenswrapper[4778]: I0318 09:23:34.450061 4778 generic.go:334] "Generic (PLEG): container finished" podID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" containerID="f3f45a8c98da9f26aefaed6f12fcf1ccfeb1c540f04357427bf8f13c5c12ad79" exitCode=0 Mar 18 09:23:34 crc kubenswrapper[4778]: I0318 09:23:34.450245 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-98prp" event={"ID":"4135fc20-df28-4f8d-b244-aedd5ed57cc2","Type":"ContainerDied","Data":"f3f45a8c98da9f26aefaed6f12fcf1ccfeb1c540f04357427bf8f13c5c12ad79"} Mar 18 09:23:35 crc kubenswrapper[4778]: I0318 09:23:35.465908 4778 generic.go:334] "Generic (PLEG): container finished" podID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerID="b36c483c2b509d6056dce716a1f9836c549822a2372786d1c313cd9da431b608" exitCode=0 Mar 18 09:23:35 crc kubenswrapper[4778]: I0318 09:23:35.465990 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" event={"ID":"0ccbef26-8b5f-4b83-885f-3c074207eb73","Type":"ContainerDied","Data":"b36c483c2b509d6056dce716a1f9836c549822a2372786d1c313cd9da431b608"} Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.665747 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2p9jg" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.684540 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-98prp" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731562 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-scripts\") pod \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731622 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-combined-ca-bundle\") pod \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731661 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-combined-ca-bundle\") pod \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731739 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-config-data\") pod \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731809 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7f7g\" (UniqueName: \"kubernetes.io/projected/20eafe8e-c0b9-4463-bc12-8c0cd0359968-kube-api-access-s7f7g\") pod \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731852 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-db-sync-config-data\") pod \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731884 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gnxr\" (UniqueName: \"kubernetes.io/projected/4135fc20-df28-4f8d-b244-aedd5ed57cc2-kube-api-access-7gnxr\") pod \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731913 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20eafe8e-c0b9-4463-bc12-8c0cd0359968-logs\") pod \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.733046 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20eafe8e-c0b9-4463-bc12-8c0cd0359968-logs" (OuterVolumeSpecName: "logs") pod "20eafe8e-c0b9-4463-bc12-8c0cd0359968" (UID: "20eafe8e-c0b9-4463-bc12-8c0cd0359968"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.745274 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4135fc20-df28-4f8d-b244-aedd5ed57cc2-kube-api-access-7gnxr" (OuterVolumeSpecName: "kube-api-access-7gnxr") pod "4135fc20-df28-4f8d-b244-aedd5ed57cc2" (UID: "4135fc20-df28-4f8d-b244-aedd5ed57cc2"). InnerVolumeSpecName "kube-api-access-7gnxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.746739 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4135fc20-df28-4f8d-b244-aedd5ed57cc2" (UID: "4135fc20-df28-4f8d-b244-aedd5ed57cc2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.758422 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-scripts" (OuterVolumeSpecName: "scripts") pod "20eafe8e-c0b9-4463-bc12-8c0cd0359968" (UID: "20eafe8e-c0b9-4463-bc12-8c0cd0359968"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.773189 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20eafe8e-c0b9-4463-bc12-8c0cd0359968-kube-api-access-s7f7g" (OuterVolumeSpecName: "kube-api-access-s7f7g") pod "20eafe8e-c0b9-4463-bc12-8c0cd0359968" (UID: "20eafe8e-c0b9-4463-bc12-8c0cd0359968"). InnerVolumeSpecName "kube-api-access-s7f7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.789092 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4135fc20-df28-4f8d-b244-aedd5ed57cc2" (UID: "4135fc20-df28-4f8d-b244-aedd5ed57cc2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.796407 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-config-data" (OuterVolumeSpecName: "config-data") pod "20eafe8e-c0b9-4463-bc12-8c0cd0359968" (UID: "20eafe8e-c0b9-4463-bc12-8c0cd0359968"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837515 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837550 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837561 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837570 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7f7g\" (UniqueName: \"kubernetes.io/projected/20eafe8e-c0b9-4463-bc12-8c0cd0359968-kube-api-access-s7f7g\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837582 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837591 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gnxr\" (UniqueName: 
\"kubernetes.io/projected/4135fc20-df28-4f8d-b244-aedd5ed57cc2-kube-api-access-7gnxr\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837599 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20eafe8e-c0b9-4463-bc12-8c0cd0359968-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.853089 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20eafe8e-c0b9-4463-bc12-8c0cd0359968" (UID: "20eafe8e-c0b9-4463-bc12-8c0cd0359968"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.854842 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.946376 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.964009 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.052698 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lwww\" (UniqueName: \"kubernetes.io/projected/0ccbef26-8b5f-4b83-885f-3c074207eb73-kube-api-access-4lwww\") pod \"0ccbef26-8b5f-4b83-885f-3c074207eb73\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.052753 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-nb\") pod \"0ccbef26-8b5f-4b83-885f-3c074207eb73\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.052794 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-sb\") pod \"0ccbef26-8b5f-4b83-885f-3c074207eb73\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.052884 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-credential-keys\") pod \"0fb26926-fc81-4024-a0fa-2363d8703d72\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.052914 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-scripts\") pod \"0fb26926-fc81-4024-a0fa-2363d8703d72\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.053256 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-dns-svc\") pod \"0ccbef26-8b5f-4b83-885f-3c074207eb73\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.053290 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-fernet-keys\") pod \"0fb26926-fc81-4024-a0fa-2363d8703d72\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.053331 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-config-data\") pod \"0fb26926-fc81-4024-a0fa-2363d8703d72\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.053365 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfjxd\" (UniqueName: \"kubernetes.io/projected/0fb26926-fc81-4024-a0fa-2363d8703d72-kube-api-access-cfjxd\") pod \"0fb26926-fc81-4024-a0fa-2363d8703d72\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.053386 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-config\") pod \"0ccbef26-8b5f-4b83-885f-3c074207eb73\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.053400 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-combined-ca-bundle\") pod \"0fb26926-fc81-4024-a0fa-2363d8703d72\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.060130 
4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-scripts" (OuterVolumeSpecName: "scripts") pod "0fb26926-fc81-4024-a0fa-2363d8703d72" (UID: "0fb26926-fc81-4024-a0fa-2363d8703d72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.060892 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ccbef26-8b5f-4b83-885f-3c074207eb73-kube-api-access-4lwww" (OuterVolumeSpecName: "kube-api-access-4lwww") pod "0ccbef26-8b5f-4b83-885f-3c074207eb73" (UID: "0ccbef26-8b5f-4b83-885f-3c074207eb73"). InnerVolumeSpecName "kube-api-access-4lwww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.063979 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb26926-fc81-4024-a0fa-2363d8703d72-kube-api-access-cfjxd" (OuterVolumeSpecName: "kube-api-access-cfjxd") pod "0fb26926-fc81-4024-a0fa-2363d8703d72" (UID: "0fb26926-fc81-4024-a0fa-2363d8703d72"). InnerVolumeSpecName "kube-api-access-cfjxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.079001 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0fb26926-fc81-4024-a0fa-2363d8703d72" (UID: "0fb26926-fc81-4024-a0fa-2363d8703d72"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.079189 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0fb26926-fc81-4024-a0fa-2363d8703d72" (UID: "0fb26926-fc81-4024-a0fa-2363d8703d72"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.093899 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fb26926-fc81-4024-a0fa-2363d8703d72" (UID: "0fb26926-fc81-4024-a0fa-2363d8703d72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.094036 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-config-data" (OuterVolumeSpecName: "config-data") pod "0fb26926-fc81-4024-a0fa-2363d8703d72" (UID: "0fb26926-fc81-4024-a0fa-2363d8703d72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.113384 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ccbef26-8b5f-4b83-885f-3c074207eb73" (UID: "0ccbef26-8b5f-4b83-885f-3c074207eb73"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.114145 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ccbef26-8b5f-4b83-885f-3c074207eb73" (UID: "0ccbef26-8b5f-4b83-885f-3c074207eb73"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.114897 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ccbef26-8b5f-4b83-885f-3c074207eb73" (UID: "0ccbef26-8b5f-4b83-885f-3c074207eb73"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.116369 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-config" (OuterVolumeSpecName: "config") pod "0ccbef26-8b5f-4b83-885f-3c074207eb73" (UID: "0ccbef26-8b5f-4b83-885f-3c074207eb73"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154486 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154529 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154541 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154552 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfjxd\" (UniqueName: \"kubernetes.io/projected/0fb26926-fc81-4024-a0fa-2363d8703d72-kube-api-access-cfjxd\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154608 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154620 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154632 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lwww\" (UniqueName: \"kubernetes.io/projected/0ccbef26-8b5f-4b83-885f-3c074207eb73-kube-api-access-4lwww\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154641 4778 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154649 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154660 4778 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154668 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.488886 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" event={"ID":"0ccbef26-8b5f-4b83-885f-3c074207eb73","Type":"ContainerDied","Data":"198149700e09863ba5824c3da59cdfa277dbef6c16734a23ecc5c937b970d9dc"} Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.488946 4778 scope.go:117] "RemoveContainer" containerID="b36c483c2b509d6056dce716a1f9836c549822a2372786d1c313cd9da431b608" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.489087 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.498790 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerStarted","Data":"19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4"} Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.502523 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2p9jg" event={"ID":"20eafe8e-c0b9-4463-bc12-8c0cd0359968","Type":"ContainerDied","Data":"f3040164c08e605fe47fcb403b7a54ffe6495d3760996218e88d3582673d242b"} Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.502573 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2p9jg" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.502577 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3040164c08e605fe47fcb403b7a54ffe6495d3760996218e88d3582673d242b" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.507664 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.507694 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5drhw" event={"ID":"0fb26926-fc81-4024-a0fa-2363d8703d72","Type":"ContainerDied","Data":"2609a43b731af1cf4cb19221492cd2a7688e79a1ecb7ef9c42072e9a7bc39519"} Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.507739 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2609a43b731af1cf4cb19221492cd2a7688e79a1ecb7ef9c42072e9a7bc39519" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.510183 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-98prp" event={"ID":"4135fc20-df28-4f8d-b244-aedd5ed57cc2","Type":"ContainerDied","Data":"49a7b38747f8562cd7a4e6e050cec3d81202329adfd053459a59103fb40b4766"} Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.510244 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49a7b38747f8562cd7a4e6e050cec3d81202329adfd053459a59103fb40b4766" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.510277 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-98prp" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.514619 4778 scope.go:117] "RemoveContainer" containerID="2b8a3d906ce9aeea7f7d488e85e8f58f0a3291f6ef7ed6a10151c8afd47df1d7" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.536054 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-nz24n"] Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.544564 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-nz24n"] Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.849672 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7b57877776-ssjzt"] Mar 18 09:23:37 crc kubenswrapper[4778]: E0318 09:23:37.852133 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerName="dnsmasq-dns" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.852260 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerName="dnsmasq-dns" Mar 18 09:23:37 crc kubenswrapper[4778]: E0318 09:23:37.852344 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" containerName="placement-db-sync" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.852403 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" containerName="placement-db-sync" Mar 18 09:23:37 crc kubenswrapper[4778]: E0318 09:23:37.852459 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" containerName="barbican-db-sync" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.852509 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" containerName="barbican-db-sync" Mar 18 09:23:37 crc kubenswrapper[4778]: E0318 09:23:37.852568 4778 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerName="init" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.852617 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerName="init" Mar 18 09:23:37 crc kubenswrapper[4778]: E0318 09:23:37.852670 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb26926-fc81-4024-a0fa-2363d8703d72" containerName="keystone-bootstrap" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.852716 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb26926-fc81-4024-a0fa-2363d8703d72" containerName="keystone-bootstrap" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.852941 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerName="dnsmasq-dns" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.853017 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" containerName="placement-db-sync" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.853085 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" containerName="barbican-db-sync" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.853147 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb26926-fc81-4024-a0fa-2363d8703d72" containerName="keystone-bootstrap" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.854424 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.856214 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4lsnj" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.858949 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.871772 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b57877776-ssjzt"] Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.873991 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.874362 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.875842 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c5e9a8c-649d-4d20-b867-bb0f801d329d-logs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880349 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-scripts\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880388 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-combined-ca-bundle\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880421 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-internal-tls-certs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880443 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg9cl\" (UniqueName: \"kubernetes.io/projected/9c5e9a8c-649d-4d20-b867-bb0f801d329d-kube-api-access-jg9cl\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880477 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-public-tls-certs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880583 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-config-data\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.982090 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-config-data\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.982928 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c5e9a8c-649d-4d20-b867-bb0f801d329d-logs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.982965 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-scripts\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.983023 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-combined-ca-bundle\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.983067 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-internal-tls-certs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.983095 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9cl\" (UniqueName: 
\"kubernetes.io/projected/9c5e9a8c-649d-4d20-b867-bb0f801d329d-kube-api-access-jg9cl\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.983138 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-public-tls-certs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.986670 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c5e9a8c-649d-4d20-b867-bb0f801d329d-logs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.987870 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-combined-ca-bundle\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.997925 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-internal-tls-certs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.999831 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-scripts\") pod \"placement-7b57877776-ssjzt\" (UID: 
\"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.000530 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-75996d8fd4-jhtd2"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.002429 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.025776 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-75996d8fd4-jhtd2"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.035174 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-public-tls-certs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.035684 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.035912 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vcgh4" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.035950 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.036084 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.036138 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-config-data\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 
09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.036826 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.037248 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.040948 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg9cl\" (UniqueName: \"kubernetes.io/projected/9c5e9a8c-649d-4d20-b867-bb0f801d329d-kube-api-access-jg9cl\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094105 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-scripts\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094155 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-config-data\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094224 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-fernet-keys\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094250 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-internal-tls-certs\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094355 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-combined-ca-bundle\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094385 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-public-tls-certs\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094453 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44cx2\" (UniqueName: \"kubernetes.io/projected/4c045639-00d0-4ba6-9d75-c67934521e29-kube-api-access-44cx2\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094474 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-credential-keys\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 
09:23:38.134663 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-679c749775-5x4dx"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.159696 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-679c749775-5x4dx"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.159885 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.187874 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.188392 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-66r47" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.188567 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.195289 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-scripts\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.195355 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-combined-ca-bundle\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.195377 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-config-data\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.195605 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2290443f-8279-4b62-9d3d-bab0be1d7af5-logs\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.195637 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-fernet-keys\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.197220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-internal-tls-certs\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.197640 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.198000 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-combined-ca-bundle\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.198027 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-public-tls-certs\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.198262 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksngs\" (UniqueName: \"kubernetes.io/projected/2290443f-8279-4b62-9d3d-bab0be1d7af5-kube-api-access-ksngs\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.198285 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data-custom\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.198501 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44cx2\" (UniqueName: \"kubernetes.io/projected/4c045639-00d0-4ba6-9d75-c67934521e29-kube-api-access-44cx2\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.198756 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-credential-keys\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.214722 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.228102 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-config-data\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.228646 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-internal-tls-certs\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.243185 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-scripts\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.244854 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-public-tls-certs\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.244865 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-credential-keys\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.249338 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-combined-ca-bundle\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.251175 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-fernet-keys\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.259898 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44cx2\" (UniqueName: \"kubernetes.io/projected/4c045639-00d0-4ba6-9d75-c67934521e29-kube-api-access-44cx2\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.262159 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" path="/var/lib/kubelet/pods/0ccbef26-8b5f-4b83-885f-3c074207eb73/volumes" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.275346 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.297262 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.298045 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.300102 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-combined-ca-bundle\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.313125 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.300147 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2290443f-8279-4b62-9d3d-bab0be1d7af5-logs\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.313657 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.313850 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksngs\" (UniqueName: \"kubernetes.io/projected/2290443f-8279-4b62-9d3d-bab0be1d7af5-kube-api-access-ksngs\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " 
pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.313887 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data-custom\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.313967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2290443f-8279-4b62-9d3d-bab0be1d7af5-logs\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.323842 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.333724 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-combined-ca-bundle\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.335333 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869f779d85-2zrdr"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.335820 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data-custom\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.336909 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.375754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksngs\" (UniqueName: \"kubernetes.io/projected/2290443f-8279-4b62-9d3d-bab0be1d7af5-kube-api-access-ksngs\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.421322 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-2zrdr"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422024 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-config\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422058 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422094 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422128 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422163 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-combined-ca-bundle\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422184 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-dns-svc\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422221 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6bt4\" (UniqueName: \"kubernetes.io/projected/3bdaed74-310d-4589-9a14-8c862f05d378-kube-api-access-h6bt4\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422255 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data-custom\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422282 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrb4m\" (UniqueName: \"kubernetes.io/projected/1cc8be70-e875-4307-89a5-9cbb0d105b86-kube-api-access-nrb4m\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422304 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bdaed74-310d-4589-9a14-8c862f05d378-logs\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422456 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.520659 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.524916 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data-custom\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525008 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrb4m\" (UniqueName: \"kubernetes.io/projected/1cc8be70-e875-4307-89a5-9cbb0d105b86-kube-api-access-nrb4m\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525048 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bdaed74-310d-4589-9a14-8c862f05d378-logs\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525112 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-config\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525141 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: 
\"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525189 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525264 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525324 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-combined-ca-bundle\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525358 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-dns-svc\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525396 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6bt4\" (UniqueName: \"kubernetes.io/projected/3bdaed74-310d-4589-9a14-8c862f05d378-kube-api-access-h6bt4\") pod 
\"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.527819 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.528508 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.528538 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bdaed74-310d-4589-9a14-8c862f05d378-logs\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.529342 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-dns-svc\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.529998 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-config\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " 
pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.542278 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5897f75bc4-n8b2b"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.543959 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.552369 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.575689 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data-custom\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.586627 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5897f75bc4-n8b2b"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.590572 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.596567 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-combined-ca-bundle\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc 
kubenswrapper[4778]: I0318 09:23:38.611255 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6bt4\" (UniqueName: \"kubernetes.io/projected/3bdaed74-310d-4589-9a14-8c862f05d378-kube-api-access-h6bt4\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.611600 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrb4m\" (UniqueName: \"kubernetes.io/projected/1cc8be70-e875-4307-89a5-9cbb0d105b86-kube-api-access-nrb4m\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.639319 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.710755 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.731249 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-769d964c9f-nxhk2"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.732941 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.752832 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753145 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-combined-ca-bundle\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753331 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-logs\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753511 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbwbt\" (UniqueName: \"kubernetes.io/projected/d5e59fa4-b6e8-4091-aedf-46c624304111-kube-api-access-lbwbt\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753632 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-config-data-custom\") 
pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753734 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-config-data\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753829 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29zfd\" (UniqueName: \"kubernetes.io/projected/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-kube-api-access-29zfd\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753922 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e59fa4-b6e8-4091-aedf-46c624304111-logs\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.754038 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data-custom\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.754135 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-combined-ca-bundle\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.803339 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8bc77f476-tw7vd"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.804912 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.824024 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-769d964c9f-nxhk2"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.862971 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8bc77f476-tw7vd"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881024 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a006670-1a48-4421-8471-dd961c0e1d4c-logs\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881095 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-logs\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881175 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsq9c\" (UniqueName: 
\"kubernetes.io/projected/3a006670-1a48-4421-8471-dd961c0e1d4c-kube-api-access-xsq9c\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881218 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbwbt\" (UniqueName: \"kubernetes.io/projected/d5e59fa4-b6e8-4091-aedf-46c624304111-kube-api-access-lbwbt\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881270 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-config-data-custom\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881314 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-config-data\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881349 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29zfd\" (UniqueName: \"kubernetes.io/projected/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-kube-api-access-29zfd\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e59fa4-b6e8-4091-aedf-46c624304111-logs\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881432 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data-custom\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881473 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-combined-ca-bundle\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881508 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-combined-ca-bundle\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881571 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881649 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-config-data\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881674 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-config-data-custom\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881714 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-combined-ca-bundle\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.883513 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-logs\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.887579 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e59fa4-b6e8-4091-aedf-46c624304111-logs\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.911428 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-config-data-custom\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2"
Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.912769 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-combined-ca-bundle\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2"
Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.926865 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-combined-ca-bundle\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b"
Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.927893 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data-custom\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b"
Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.929136 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbwbt\" (UniqueName: \"kubernetes.io/projected/d5e59fa4-b6e8-4091-aedf-46c624304111-kube-api-access-lbwbt\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b"
Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.958792 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b"
Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.961980 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-config-data\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2"
Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.962614 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29zfd\" (UniqueName: \"kubernetes.io/projected/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-kube-api-access-29zfd\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.000026 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-combined-ca-bundle\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.000333 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-config-data\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.000403 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-config-data-custom\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.000586 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a006670-1a48-4421-8471-dd961c0e1d4c-logs\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.000734 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsq9c\" (UniqueName: \"kubernetes.io/projected/3a006670-1a48-4421-8471-dd961c0e1d4c-kube-api-access-xsq9c\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.010842 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-combined-ca-bundle\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.010940 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a006670-1a48-4421-8471-dd961c0e1d4c-logs\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.019325 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-config-data-custom\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.020743 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-config-data\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.023892 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsq9c\" (UniqueName: \"kubernetes.io/projected/3a006670-1a48-4421-8471-dd961c0e1d4c-kube-api-access-xsq9c\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.025358 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-df89bff66-xp7n4"]
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.027113 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.036665 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-df89bff66-xp7n4"]
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.112571 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5897f75bc4-n8b2b"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.163886 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.164322 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-769d964c9f-nxhk2"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.210143 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data-custom\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.210309 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm7dv\" (UniqueName: \"kubernetes.io/projected/48e51d3c-7f9f-4196-b708-c28c0e7477fa-kube-api-access-qm7dv\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.210623 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.210717 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-combined-ca-bundle\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.210773 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48e51d3c-7f9f-4196-b708-c28c0e7477fa-logs\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.314331 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.314532 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-combined-ca-bundle\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.314600 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48e51d3c-7f9f-4196-b708-c28c0e7477fa-logs\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.314712 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data-custom\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.314782 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm7dv\" (UniqueName: \"kubernetes.io/projected/48e51d3c-7f9f-4196-b708-c28c0e7477fa-kube-api-access-qm7dv\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.315087 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48e51d3c-7f9f-4196-b708-c28c0e7477fa-logs\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.335840 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-combined-ca-bundle\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.336934 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.346431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data-custom\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.352105 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm7dv\" (UniqueName: \"kubernetes.io/projected/48e51d3c-7f9f-4196-b708-c28c0e7477fa-kube-api-access-qm7dv\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.368633 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.481217 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b57877776-ssjzt"]
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.678621 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-75996d8fd4-jhtd2"]
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.770510 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jb4ss" event={"ID":"fba399d9-71ac-41c3-912f-32ccc7fc6190","Type":"ContainerStarted","Data":"b5325ab7bf5fcc801abe0c67c554c5ab72e440e7503aafe03c45a398c6a12432"}
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.778304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57877776-ssjzt" event={"ID":"9c5e9a8c-649d-4d20-b867-bb0f801d329d","Type":"ContainerStarted","Data":"dcaa7760f0d0e632c657a22054c5f006fe1f82143f10b41d8d1cb108f3a1621b"}
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.785932 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-75996d8fd4-jhtd2" event={"ID":"4c045639-00d0-4ba6-9d75-c67934521e29","Type":"ContainerStarted","Data":"7b8cbb9769a8a5897515734bd6020fdbe13f12d735c496630fe08f1100290cca"}
Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.805945 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-jb4ss" podStartSLOduration=4.984078028 podStartE2EDuration="46.805921381s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="2026-03-18 09:22:55.883754946 +0000 UTC m=+1242.458499786" lastFinishedPulling="2026-03-18 09:23:37.705598299 +0000 UTC m=+1284.280343139" observedRunningTime="2026-03-18 09:23:39.805404096 +0000 UTC m=+1286.380148936" watchObservedRunningTime="2026-03-18 09:23:39.805921381 +0000 UTC m=+1286.380666221"
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.339383 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-2zrdr"]
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.352279 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-679c749775-5x4dx"]
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.383163 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"]
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.438997 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5897f75bc4-n8b2b"]
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.463854 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-769d964c9f-nxhk2"]
Mar 18 09:23:40 crc kubenswrapper[4778]: W0318 09:23:40.464809 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5e59fa4_b6e8_4091_aedf_46c624304111.slice/crio-59137511add332c1310983740cad93329cdff92e5782f78a0a877c931d4f0027 WatchSource:0}: Error finding container 59137511add332c1310983740cad93329cdff92e5782f78a0a877c931d4f0027: Status 404 returned error can't find the container with id 59137511add332c1310983740cad93329cdff92e5782f78a0a877c931d4f0027
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.482834 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-df89bff66-xp7n4"]
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.641079 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8bc77f476-tw7vd"]
Mar 18 09:23:40 crc kubenswrapper[4778]: W0318 09:23:40.679504 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a006670_1a48_4421_8471_dd961c0e1d4c.slice/crio-046678c0e7673a52b9c1716d4f49b4d9a4c9e6c696c20d2a4fec81f855b0e29a WatchSource:0}: Error finding container 046678c0e7673a52b9c1716d4f49b4d9a4c9e6c696c20d2a4fec81f855b0e29a: Status 404 returned error can't find the container with id 046678c0e7673a52b9c1716d4f49b4d9a4c9e6c696c20d2a4fec81f855b0e29a
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.818223 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679c749775-5x4dx" event={"ID":"2290443f-8279-4b62-9d3d-bab0be1d7af5","Type":"ContainerStarted","Data":"44a3e56960042fea2cef1afef1f593fd3c5991c7f2d6bb676b183bc24c3832e2"}
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.822144 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-75996d8fd4-jhtd2" event={"ID":"4c045639-00d0-4ba6-9d75-c67934521e29","Type":"ContainerStarted","Data":"0243bfbf668a6a163b2df6e7d0a281e7c7cec943948afd3c039de9ffc7f7f5fc"}
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.823565 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-75996d8fd4-jhtd2"
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.828114 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df89bff66-xp7n4" event={"ID":"48e51d3c-7f9f-4196-b708-c28c0e7477fa","Type":"ContainerStarted","Data":"b165bcb9da7478d0edb4f2c1e6142d197b4e354bbdd69e9690efc132cc26e106"}
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.835747 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" event={"ID":"3bdaed74-310d-4589-9a14-8c862f05d378","Type":"ContainerStarted","Data":"936e07a4b38cfd1b5b28cfe412a7dc4ba1163eceb3ae2617b1ff1b9e372215e3"}
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.852615 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-75996d8fd4-jhtd2" podStartSLOduration=3.852598651 podStartE2EDuration="3.852598651s" podCreationTimestamp="2026-03-18 09:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:40.84964491 +0000 UTC m=+1287.424389750" watchObservedRunningTime="2026-03-18 09:23:40.852598651 +0000 UTC m=+1287.427343501"
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.875952 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-769d964c9f-nxhk2" event={"ID":"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b","Type":"ContainerStarted","Data":"f7fb335f9e8b41b5f6d83a14f4cf791d4cd409aa6464ca1fb53047a01a82dcdb"}
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.881489 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f75bc4-n8b2b" event={"ID":"d5e59fa4-b6e8-4091-aedf-46c624304111","Type":"ContainerStarted","Data":"59137511add332c1310983740cad93329cdff92e5782f78a0a877c931d4f0027"}
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.884747 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57877776-ssjzt" event={"ID":"9c5e9a8c-649d-4d20-b867-bb0f801d329d","Type":"ContainerStarted","Data":"2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f"}
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.884799 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57877776-ssjzt" event={"ID":"9c5e9a8c-649d-4d20-b867-bb0f801d329d","Type":"ContainerStarted","Data":"8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7"}
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.884852 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b57877776-ssjzt"
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.884910 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b57877776-ssjzt"
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.888130 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" event={"ID":"3a006670-1a48-4421-8471-dd961c0e1d4c","Type":"ContainerStarted","Data":"046678c0e7673a52b9c1716d4f49b4d9a4c9e6c696c20d2a4fec81f855b0e29a"}
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.892656 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" event={"ID":"1cc8be70-e875-4307-89a5-9cbb0d105b86","Type":"ContainerStarted","Data":"cc995beb30e43a9c4d675261cc6a1346b30a1e32f69b240f14fdcb175b6ff16e"}
Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.915274 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7b57877776-ssjzt" podStartSLOduration=3.915248988 podStartE2EDuration="3.915248988s" podCreationTimestamp="2026-03-18 09:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:40.903438266 +0000 UTC m=+1287.478183106" watchObservedRunningTime="2026-03-18 09:23:40.915248988 +0000 UTC m=+1287.489993828"
Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.909571 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f75bc4-n8b2b" event={"ID":"d5e59fa4-b6e8-4091-aedf-46c624304111","Type":"ContainerStarted","Data":"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8"}
Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.909897 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f75bc4-n8b2b" event={"ID":"d5e59fa4-b6e8-4091-aedf-46c624304111","Type":"ContainerStarted","Data":"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9"}
Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.909940 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5897f75bc4-n8b2b"
Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.909959 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5897f75bc4-n8b2b"
Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.916060 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df89bff66-xp7n4" event={"ID":"48e51d3c-7f9f-4196-b708-c28c0e7477fa","Type":"ContainerStarted","Data":"f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad"}
Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.916092 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df89bff66-xp7n4" event={"ID":"48e51d3c-7f9f-4196-b708-c28c0e7477fa","Type":"ContainerStarted","Data":"3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33"}
Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.916884 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.916912 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.929514 4778 generic.go:334] "Generic (PLEG): container finished" podID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerID="5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8" exitCode=0
Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.930946 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" event={"ID":"1cc8be70-e875-4307-89a5-9cbb0d105b86","Type":"ContainerDied","Data":"5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8"}
Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.992502 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5897f75bc4-n8b2b"]
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.158144 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5897f75bc4-n8b2b" podStartSLOduration=4.158121342 podStartE2EDuration="4.158121342s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:42.036616503 +0000 UTC m=+1288.611361353" watchObservedRunningTime="2026-03-18 09:23:42.158121342 +0000 UTC m=+1288.732866182"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.180305 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84d7458cd-cb86l"]
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.182679 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.188684 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.188893 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.239780 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84d7458cd-cb86l"]
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.281062 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-df89bff66-xp7n4" podStartSLOduration=4.28102277 podStartE2EDuration="4.28102277s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:42.120830947 +0000 UTC m=+1288.695575797" watchObservedRunningTime="2026-03-18 09:23:42.28102277 +0000 UTC m=+1288.855767610"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348716 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-internal-tls-certs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348772 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-config-data-custom\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348831 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-public-tls-certs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348865 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8wc5\" (UniqueName: \"kubernetes.io/projected/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-kube-api-access-v8wc5\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348914 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-combined-ca-bundle\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348931 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-config-data\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348956 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-logs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450490 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8wc5\" (UniqueName: \"kubernetes.io/projected/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-kube-api-access-v8wc5\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450562 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-combined-ca-bundle\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450581 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-config-data\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450608 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-logs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450665 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-internal-tls-certs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450690 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-config-data-custom\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450736 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-public-tls-certs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.452907 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-logs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.457783 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-internal-tls-certs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.457990 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-config-data-custom\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.459844 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-public-tls-certs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.460386 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-combined-ca-bundle\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.472431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-config-data\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.478371 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8wc5\" (UniqueName: \"kubernetes.io/projected/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-kube-api-access-v8wc5\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.541490 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.784570 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-99c8bfc86-rldfg" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.861305 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-644f48df4-b7jhq" podUID="e0a0a638-c445-4931-861e-d35704487c97" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.948551 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" event={"ID":"1cc8be70-e875-4307-89a5-9cbb0d105b86","Type":"ContainerStarted","Data":"9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed"}
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.949678 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869f779d85-2zrdr"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.986595 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" podStartSLOduration=4.986568529 podStartE2EDuration="4.986568529s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:42.979717332 +0000 UTC m=+1289.554462192" watchObservedRunningTime="2026-03-18 09:23:42.986568529 +0000 UTC m=+1289.561313369"
Mar 18 09:23:43 crc kubenswrapper[4778]: I0318 09:23:43.908747 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84d7458cd-cb86l"]
Mar 18 09:23:43 crc kubenswrapper[4778]: I0318 09:23:43.991476 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-769d964c9f-nxhk2" event={"ID":"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b","Type":"ContainerStarted","Data":"d1cd21f00ad51ee5e120c6ce0764e4fc97e1b7010e1997b14513affbb8ff8d5a"}
Mar 18 09:23:43 crc kubenswrapper[4778]: I0318 09:23:43.996323 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679c749775-5x4dx" event={"ID":"2290443f-8279-4b62-9d3d-bab0be1d7af5","Type":"ContainerStarted","Data":"e9f7b0df052a4e1bc6dea1dc19739579966633e37f974824883fab15b09a8f39"}
Mar 18 09:23:43 crc kubenswrapper[4778]: I0318 09:23:43.999374 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d7458cd-cb86l" event={"ID":"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55","Type":"ContainerStarted","Data":"54f6da0c089b4edf9a7ae0a3b8e1da1139b6c01d35705d878c7cf5873639f7dd"}
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.002155 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" event={"ID":"3a006670-1a48-4421-8471-dd961c0e1d4c","Type":"ContainerStarted","Data":"5ab76e88dc154ab2acf57f179f9ba99e2ed5d45ca1c01ed21a8ae72223921734"}
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.004298 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5897f75bc4-n8b2b" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api-log" containerID="cri-o://edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9" gracePeriod=30
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.004613 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"
event={"ID":"3bdaed74-310d-4589-9a14-8c862f05d378","Type":"ContainerStarted","Data":"358bb2fa0945b8176753e109d73c3f83c6eac33a4e8d6a66d2281628c5bd5f11"} Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.005853 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5897f75bc4-n8b2b" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api" containerID="cri-o://3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8" gracePeriod=30 Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.698635 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.715030 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbwbt\" (UniqueName: \"kubernetes.io/projected/d5e59fa4-b6e8-4091-aedf-46c624304111-kube-api-access-lbwbt\") pod \"d5e59fa4-b6e8-4091-aedf-46c624304111\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.715078 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data-custom\") pod \"d5e59fa4-b6e8-4091-aedf-46c624304111\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.715124 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e59fa4-b6e8-4091-aedf-46c624304111-logs\") pod \"d5e59fa4-b6e8-4091-aedf-46c624304111\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.715178 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data\") pod \"d5e59fa4-b6e8-4091-aedf-46c624304111\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.715235 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-combined-ca-bundle\") pod \"d5e59fa4-b6e8-4091-aedf-46c624304111\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.717351 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e59fa4-b6e8-4091-aedf-46c624304111-logs" (OuterVolumeSpecName: "logs") pod "d5e59fa4-b6e8-4091-aedf-46c624304111" (UID: "d5e59fa4-b6e8-4091-aedf-46c624304111"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.740657 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e59fa4-b6e8-4091-aedf-46c624304111-kube-api-access-lbwbt" (OuterVolumeSpecName: "kube-api-access-lbwbt") pod "d5e59fa4-b6e8-4091-aedf-46c624304111" (UID: "d5e59fa4-b6e8-4091-aedf-46c624304111"). InnerVolumeSpecName "kube-api-access-lbwbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.740905 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d5e59fa4-b6e8-4091-aedf-46c624304111" (UID: "d5e59fa4-b6e8-4091-aedf-46c624304111"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.771820 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5e59fa4-b6e8-4091-aedf-46c624304111" (UID: "d5e59fa4-b6e8-4091-aedf-46c624304111"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.817879 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbwbt\" (UniqueName: \"kubernetes.io/projected/d5e59fa4-b6e8-4091-aedf-46c624304111-kube-api-access-lbwbt\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.817909 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.817918 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e59fa4-b6e8-4091-aedf-46c624304111-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.817927 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.859300 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data" (OuterVolumeSpecName: "config-data") pod "d5e59fa4-b6e8-4091-aedf-46c624304111" (UID: "d5e59fa4-b6e8-4091-aedf-46c624304111"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.918950 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021630 4778 generic.go:334] "Generic (PLEG): container finished" podID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerID="3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8" exitCode=0 Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021677 4778 generic.go:334] "Generic (PLEG): container finished" podID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerID="edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9" exitCode=143 Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021732 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f75bc4-n8b2b" event={"ID":"d5e59fa4-b6e8-4091-aedf-46c624304111","Type":"ContainerDied","Data":"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8"} Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021770 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f75bc4-n8b2b" event={"ID":"d5e59fa4-b6e8-4091-aedf-46c624304111","Type":"ContainerDied","Data":"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9"} Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021783 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f75bc4-n8b2b" event={"ID":"d5e59fa4-b6e8-4091-aedf-46c624304111","Type":"ContainerDied","Data":"59137511add332c1310983740cad93329cdff92e5782f78a0a877c931d4f0027"} Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021805 4778 scope.go:117] "RemoveContainer" containerID="3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 
09:23:45.021965 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.042716 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d7458cd-cb86l" event={"ID":"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55","Type":"ContainerStarted","Data":"624c4dffba896b6c96ce6f6e7925b0bf9777ecb41e219b64897938bfc4b24acc"} Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.042788 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d7458cd-cb86l" event={"ID":"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55","Type":"ContainerStarted","Data":"e27bba0454b6d4077f23c7beda3af3ad09cca815f9a6d526ef38e1f8e940b7d5"} Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.044549 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.044581 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.062280 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" event={"ID":"3a006670-1a48-4421-8471-dd961c0e1d4c","Type":"ContainerStarted","Data":"07b54bf16d1a9f269c10f038b41bf94b4dec1c7fb718055d08d33e1f4138193a"} Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.072551 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" event={"ID":"3bdaed74-310d-4589-9a14-8c862f05d378","Type":"ContainerStarted","Data":"e7e9f37ed0458c02870caf56e192acdd710a30f50cce72d6e9f4bfe098996a15"} Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.096555 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-769d964c9f-nxhk2" 
event={"ID":"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b","Type":"ContainerStarted","Data":"2bc045235578db48c48f872727668577d24d6ca472d0fd364cb530018321cec2"} Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.097607 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84d7458cd-cb86l" podStartSLOduration=4.09757719 podStartE2EDuration="4.09757719s" podCreationTimestamp="2026-03-18 09:23:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:45.089800299 +0000 UTC m=+1291.664545139" watchObservedRunningTime="2026-03-18 09:23:45.09757719 +0000 UTC m=+1291.672322030" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.105555 4778 scope.go:117] "RemoveContainer" containerID="edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.118879 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679c749775-5x4dx" event={"ID":"2290443f-8279-4b62-9d3d-bab0be1d7af5","Type":"ContainerStarted","Data":"fa720a6916b9236597d16b19d9dd408cdda0fb22a3b213c1f2335a3530e41d8d"} Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.120181 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" podStartSLOduration=4.060022811 podStartE2EDuration="7.120159817s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="2026-03-18 09:23:40.398234104 +0000 UTC m=+1286.972978944" lastFinishedPulling="2026-03-18 09:23:43.45837111 +0000 UTC m=+1290.033115950" observedRunningTime="2026-03-18 09:23:45.117576206 +0000 UTC m=+1291.692321036" watchObservedRunningTime="2026-03-18 09:23:45.120159817 +0000 UTC m=+1291.694904657" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.155511 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-api-5897f75bc4-n8b2b"] Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.156720 4778 scope.go:117] "RemoveContainer" containerID="3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8" Mar 18 09:23:45 crc kubenswrapper[4778]: E0318 09:23:45.160605 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8\": container with ID starting with 3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8 not found: ID does not exist" containerID="3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.160655 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8"} err="failed to get container status \"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8\": rpc error: code = NotFound desc = could not find container \"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8\": container with ID starting with 3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8 not found: ID does not exist" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.160685 4778 scope.go:117] "RemoveContainer" containerID="edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9" Mar 18 09:23:45 crc kubenswrapper[4778]: E0318 09:23:45.164283 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9\": container with ID starting with edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9 not found: ID does not exist" containerID="edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.164313 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9"} err="failed to get container status \"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9\": rpc error: code = NotFound desc = could not find container \"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9\": container with ID starting with edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9 not found: ID does not exist" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.164333 4778 scope.go:117] "RemoveContainer" containerID="3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.164451 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5897f75bc4-n8b2b"] Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.167544 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8"} err="failed to get container status \"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8\": rpc error: code = NotFound desc = could not find container \"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8\": container with ID starting with 3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8 not found: ID does not exist" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.167573 4778 scope.go:117] "RemoveContainer" containerID="edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.173326 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9"} err="failed to get container status \"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9\": rpc error: code = 
NotFound desc = could not find container \"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9\": container with ID starting with edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9 not found: ID does not exist" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.183425 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" podStartSLOduration=4.417797855 podStartE2EDuration="7.183384228s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="2026-03-18 09:23:40.693562279 +0000 UTC m=+1287.268307119" lastFinishedPulling="2026-03-18 09:23:43.459148652 +0000 UTC m=+1290.033893492" observedRunningTime="2026-03-18 09:23:45.173703895 +0000 UTC m=+1291.748448735" watchObservedRunningTime="2026-03-18 09:23:45.183384228 +0000 UTC m=+1291.758129068" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.212334 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-679c749775-5x4dx" podStartSLOduration=4.117483525 podStartE2EDuration="7.212299415s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="2026-03-18 09:23:40.363653853 +0000 UTC m=+1286.938398693" lastFinishedPulling="2026-03-18 09:23:43.458469733 +0000 UTC m=+1290.033214583" observedRunningTime="2026-03-18 09:23:45.198338905 +0000 UTC m=+1291.773083745" watchObservedRunningTime="2026-03-18 09:23:45.212299415 +0000 UTC m=+1291.787044255" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.318451 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"] Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.335219 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-769d964c9f-nxhk2" podStartSLOduration=4.470206522 podStartE2EDuration="7.335167102s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" 
firstStartedPulling="2026-03-18 09:23:40.557734399 +0000 UTC m=+1287.132479239" lastFinishedPulling="2026-03-18 09:23:43.422694979 +0000 UTC m=+1289.997439819" observedRunningTime="2026-03-18 09:23:45.247788432 +0000 UTC m=+1291.822533282" watchObservedRunningTime="2026-03-18 09:23:45.335167102 +0000 UTC m=+1291.909911952" Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.367970 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-679c749775-5x4dx"] Mar 18 09:23:46 crc kubenswrapper[4778]: I0318 09:23:46.135976 4778 generic.go:334] "Generic (PLEG): container finished" podID="fba399d9-71ac-41c3-912f-32ccc7fc6190" containerID="b5325ab7bf5fcc801abe0c67c554c5ab72e440e7503aafe03c45a398c6a12432" exitCode=0 Mar 18 09:23:46 crc kubenswrapper[4778]: I0318 09:23:46.136304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jb4ss" event={"ID":"fba399d9-71ac-41c3-912f-32ccc7fc6190","Type":"ContainerDied","Data":"b5325ab7bf5fcc801abe0c67c554c5ab72e440e7503aafe03c45a398c6a12432"} Mar 18 09:23:46 crc kubenswrapper[4778]: I0318 09:23:46.198973 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" path="/var/lib/kubelet/pods/d5e59fa4-b6e8-4091-aedf-46c624304111/volumes" Mar 18 09:23:47 crc kubenswrapper[4778]: I0318 09:23:47.147348 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener-log" containerID="cri-o://358bb2fa0945b8176753e109d73c3f83c6eac33a4e8d6a66d2281628c5bd5f11" gracePeriod=30 Mar 18 09:23:47 crc kubenswrapper[4778]: I0318 09:23:47.147462 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener" 
containerID="cri-o://e7e9f37ed0458c02870caf56e192acdd710a30f50cce72d6e9f4bfe098996a15" gracePeriod=30 Mar 18 09:23:47 crc kubenswrapper[4778]: I0318 09:23:47.147852 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-679c749775-5x4dx" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker-log" containerID="cri-o://e9f7b0df052a4e1bc6dea1dc19739579966633e37f974824883fab15b09a8f39" gracePeriod=30 Mar 18 09:23:47 crc kubenswrapper[4778]: I0318 09:23:47.147857 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-679c749775-5x4dx" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker" containerID="cri-o://fa720a6916b9236597d16b19d9dd408cdda0fb22a3b213c1f2335a3530e41d8d" gracePeriod=30 Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.166448 4778 generic.go:334] "Generic (PLEG): container finished" podID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerID="e9f7b0df052a4e1bc6dea1dc19739579966633e37f974824883fab15b09a8f39" exitCode=143 Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.166495 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679c749775-5x4dx" event={"ID":"2290443f-8279-4b62-9d3d-bab0be1d7af5","Type":"ContainerDied","Data":"e9f7b0df052a4e1bc6dea1dc19739579966633e37f974824883fab15b09a8f39"} Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.176610 4778 generic.go:334] "Generic (PLEG): container finished" podID="3bdaed74-310d-4589-9a14-8c862f05d378" containerID="e7e9f37ed0458c02870caf56e192acdd710a30f50cce72d6e9f4bfe098996a15" exitCode=0 Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.176664 4778 generic.go:334] "Generic (PLEG): container finished" podID="3bdaed74-310d-4589-9a14-8c862f05d378" containerID="358bb2fa0945b8176753e109d73c3f83c6eac33a4e8d6a66d2281628c5bd5f11" exitCode=143 Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.176676 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" event={"ID":"3bdaed74-310d-4589-9a14-8c862f05d378","Type":"ContainerDied","Data":"e7e9f37ed0458c02870caf56e192acdd710a30f50cce72d6e9f4bfe098996a15"} Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.176739 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" event={"ID":"3bdaed74-310d-4589-9a14-8c862f05d378","Type":"ContainerDied","Data":"358bb2fa0945b8176753e109d73c3f83c6eac33a4e8d6a66d2281628c5bd5f11"} Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.713458 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.793120 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-p2knr"] Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.793543 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="dnsmasq-dns" containerID="cri-o://14bdd0eaf1dbaf2dd412641a551329bb6a350b27bbb5a0fca3b8cd2009883565" gracePeriod=10 Mar 18 09:23:49 crc kubenswrapper[4778]: I0318 09:23:49.193633 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused" Mar 18 09:23:49 crc kubenswrapper[4778]: I0318 09:23:49.197435 4778 generic.go:334] "Generic (PLEG): container finished" podID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerID="fa720a6916b9236597d16b19d9dd408cdda0fb22a3b213c1f2335a3530e41d8d" exitCode=0 Mar 18 09:23:49 crc kubenswrapper[4778]: I0318 09:23:49.197513 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-679c749775-5x4dx" event={"ID":"2290443f-8279-4b62-9d3d-bab0be1d7af5","Type":"ContainerDied","Data":"fa720a6916b9236597d16b19d9dd408cdda0fb22a3b213c1f2335a3530e41d8d"} Mar 18 09:23:49 crc kubenswrapper[4778]: I0318 09:23:49.213188 4778 generic.go:334] "Generic (PLEG): container finished" podID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerID="14bdd0eaf1dbaf2dd412641a551329bb6a350b27bbb5a0fca3b8cd2009883565" exitCode=0 Mar 18 09:23:49 crc kubenswrapper[4778]: I0318 09:23:49.213294 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" event={"ID":"ae2c97b2-c699-443a-b3b3-ecb22de258c2","Type":"ContainerDied","Data":"14bdd0eaf1dbaf2dd412641a551329bb6a350b27bbb5a0fca3b8cd2009883565"} Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.137327 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.156838 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-combined-ca-bundle\") pod \"fba399d9-71ac-41c3-912f-32ccc7fc6190\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.156942 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9z5l\" (UniqueName: \"kubernetes.io/projected/fba399d9-71ac-41c3-912f-32ccc7fc6190-kube-api-access-c9z5l\") pod \"fba399d9-71ac-41c3-912f-32ccc7fc6190\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.157050 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-db-sync-config-data\") pod \"fba399d9-71ac-41c3-912f-32ccc7fc6190\" (UID: 
\"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.157132 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fba399d9-71ac-41c3-912f-32ccc7fc6190-etc-machine-id\") pod \"fba399d9-71ac-41c3-912f-32ccc7fc6190\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.157241 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-config-data\") pod \"fba399d9-71ac-41c3-912f-32ccc7fc6190\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.157300 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-scripts\") pod \"fba399d9-71ac-41c3-912f-32ccc7fc6190\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.157618 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fba399d9-71ac-41c3-912f-32ccc7fc6190-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fba399d9-71ac-41c3-912f-32ccc7fc6190" (UID: "fba399d9-71ac-41c3-912f-32ccc7fc6190"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.157873 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fba399d9-71ac-41c3-912f-32ccc7fc6190-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.169241 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fba399d9-71ac-41c3-912f-32ccc7fc6190" (UID: "fba399d9-71ac-41c3-912f-32ccc7fc6190"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.173391 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fba399d9-71ac-41c3-912f-32ccc7fc6190-kube-api-access-c9z5l" (OuterVolumeSpecName: "kube-api-access-c9z5l") pod "fba399d9-71ac-41c3-912f-32ccc7fc6190" (UID: "fba399d9-71ac-41c3-912f-32ccc7fc6190"). InnerVolumeSpecName "kube-api-access-c9z5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.182985 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-scripts" (OuterVolumeSpecName: "scripts") pod "fba399d9-71ac-41c3-912f-32ccc7fc6190" (UID: "fba399d9-71ac-41c3-912f-32ccc7fc6190"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.216521 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fba399d9-71ac-41c3-912f-32ccc7fc6190" (UID: "fba399d9-71ac-41c3-912f-32ccc7fc6190"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.232874 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.248843 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-config-data" (OuterVolumeSpecName: "config-data") pod "fba399d9-71ac-41c3-912f-32ccc7fc6190" (UID: "fba399d9-71ac-41c3-912f-32ccc7fc6190"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.260108 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9z5l\" (UniqueName: \"kubernetes.io/projected/fba399d9-71ac-41c3-912f-32ccc7fc6190-kube-api-access-c9z5l\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.260131 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.260142 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.260151 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.260160 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.318068 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jb4ss" event={"ID":"fba399d9-71ac-41c3-912f-32ccc7fc6190","Type":"ContainerDied","Data":"fef33d3c7f1855b88b205e58a953db1eba3e6979fcc1430ebec66414bb961b50"} Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.318110 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fef33d3c7f1855b88b205e58a953db1eba3e6979fcc1430ebec66414bb961b50" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.381350 4778 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:23:51 crc kubenswrapper[4778]: E0318 09:23:51.381750 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api-log" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.381762 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api-log" Mar 18 09:23:51 crc kubenswrapper[4778]: E0318 09:23:51.381778 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.381784 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api" Mar 18 09:23:51 crc kubenswrapper[4778]: E0318 09:23:51.381797 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba399d9-71ac-41c3-912f-32ccc7fc6190" containerName="cinder-db-sync" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.381802 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba399d9-71ac-41c3-912f-32ccc7fc6190" containerName="cinder-db-sync" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.381981 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api-log" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.382010 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.382017 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fba399d9-71ac-41c3-912f-32ccc7fc6190" containerName="cinder-db-sync" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.382944 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.387439 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.387661 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.388383 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.388516 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r86tv" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.400540 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.461012 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.477853 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-9xln9"] Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.479330 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.491606 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.491653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.491711 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-scripts\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.491741 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.491764 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c30dcf-f49d-430b-a240-aefe036afeeb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 
18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.491806 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mnd9\" (UniqueName: \"kubernetes.io/projected/25c30dcf-f49d-430b-a240-aefe036afeeb-kube-api-access-5mnd9\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.498955 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-9xln9"] Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.541053 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595063 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-dns-svc\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-config\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595468 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595492 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595517 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595559 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595582 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-scripts\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hck6z\" (UniqueName: \"kubernetes.io/projected/06a6d934-2f47-4628-a328-6ba9cefb8090-kube-api-access-hck6z\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595627 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595646 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c30dcf-f49d-430b-a240-aefe036afeeb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595676 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mnd9\" (UniqueName: \"kubernetes.io/projected/25c30dcf-f49d-430b-a240-aefe036afeeb-kube-api-access-5mnd9\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.603346 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c30dcf-f49d-430b-a240-aefe036afeeb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.607042 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.607618 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-combined-ca-bundle\") pod \"cinder-scheduler-0\" 
(UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.613221 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.616857 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mnd9\" (UniqueName: \"kubernetes.io/projected/25c30dcf-f49d-430b-a240-aefe036afeeb-kube-api-access-5mnd9\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.634732 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-scripts\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.696848 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.696910 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.696966 
4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hck6z\" (UniqueName: \"kubernetes.io/projected/06a6d934-2f47-4628-a328-6ba9cefb8090-kube-api-access-hck6z\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.697046 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-dns-svc\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.697072 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-config\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.698967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.699557 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.700349 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-dns-svc\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.703559 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-config\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.711666 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.715754 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.717750 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.730439 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hck6z\" (UniqueName: \"kubernetes.io/projected/06a6d934-2f47-4628-a328-6ba9cefb8090-kube-api-access-hck6z\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.736109 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.761778 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800314 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800367 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800387 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-scripts\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800404 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ef4b958-769d-43c2-91d4-3a6cb76d3851-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vxxv\" (UniqueName: \"kubernetes.io/projected/7ef4b958-769d-43c2-91d4-3a6cb76d3851-kube-api-access-7vxxv\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800451 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef4b958-769d-43c2-91d4-3a6cb76d3851-logs\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800529 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.833807 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902190 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-scripts\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902241 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ef4b958-769d-43c2-91d4-3a6cb76d3851-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902281 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vxxv\" (UniqueName: \"kubernetes.io/projected/7ef4b958-769d-43c2-91d4-3a6cb76d3851-kube-api-access-7vxxv\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902300 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef4b958-769d-43c2-91d4-3a6cb76d3851-logs\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902383 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902438 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902462 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.903516 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef4b958-769d-43c2-91d4-3a6cb76d3851-logs\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.903577 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ef4b958-769d-43c2-91d4-3a6cb76d3851-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.907081 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.907108 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.909776 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-scripts\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.914140 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.924770 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vxxv\" (UniqueName: \"kubernetes.io/projected/7ef4b958-769d-43c2-91d4-3a6cb76d3851-kube-api-access-7vxxv\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:52 crc kubenswrapper[4778]: I0318 09:23:52.115190 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 09:23:53 crc kubenswrapper[4778]: I0318 09:23:53.647388 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.257444 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.264688 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.601254 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56b9647d87-2qhmh"] Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.601785 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56b9647d87-2qhmh" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-api" containerID="cri-o://1b28dfe7d2202acd51cd4556f1c17c3749cac234f100a94b65f9772c04f97b62" gracePeriod=30 Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.602036 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56b9647d87-2qhmh" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-httpd" containerID="cri-o://97b1e09d6053130fb7ed8cd3a7a1b57d88faf0c15aceaa656cf6b72bf7bed517" gracePeriod=30 Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.617835 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-56b9647d87-2qhmh" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9696/\": read tcp 10.217.0.2:37418->10.217.0.151:9696: read: connection reset by peer" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.646022 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d979499f7-4flxt"] Mar 18 09:23:54 crc 
kubenswrapper[4778]: I0318 09:23:54.647884 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.698095 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d979499f7-4flxt"] Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.745374 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.763334 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-config\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.763421 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-public-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.763691 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nntv4\" (UniqueName: \"kubernetes.io/projected/da263057-3652-4ae8-8435-4f80e4b13804-kube-api-access-nntv4\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.764835 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-httpd-config\") pod \"neutron-6d979499f7-4flxt\" 
(UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.765942 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-combined-ca-bundle\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.766093 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-ovndb-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.766185 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-internal-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869146 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-config\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869219 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-public-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: 
\"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869304 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nntv4\" (UniqueName: \"kubernetes.io/projected/da263057-3652-4ae8-8435-4f80e4b13804-kube-api-access-nntv4\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869326 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-httpd-config\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869396 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-combined-ca-bundle\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869416 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-ovndb-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869437 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-internal-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " 
pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.889276 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-config\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.913163 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-httpd-config\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.914324 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-public-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.914819 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-internal-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.915049 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-ovndb-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.915587 4778 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-api-df89bff66-xp7n4"] Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.915770 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-df89bff66-xp7n4" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api-log" containerID="cri-o://3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33" gracePeriod=30 Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.916158 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-df89bff66-xp7n4" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api" containerID="cri-o://f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad" gracePeriod=30 Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.932127 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-combined-ca-bundle\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.966784 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nntv4\" (UniqueName: \"kubernetes.io/projected/da263057-3652-4ae8-8435-4f80e4b13804-kube-api-access-nntv4\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.329141 4778 generic.go:334] "Generic (PLEG): container finished" podID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerID="3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33" exitCode=143 Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.329756 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df89bff66-xp7n4" 
event={"ID":"48e51d3c-7f9f-4196-b708-c28c0e7477fa","Type":"ContainerDied","Data":"3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33"} Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.355230 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" event={"ID":"ae2c97b2-c699-443a-b3b3-ecb22de258c2","Type":"ContainerDied","Data":"548920e3510d285dcfc8978dccbf1745880c114136f2e4109ae6de9a628d0437"} Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.355274 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="548920e3510d285dcfc8978dccbf1745880c114136f2e4109ae6de9a628d0437" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.356854 4778 generic.go:334] "Generic (PLEG): container finished" podID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerID="6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a" exitCode=137 Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.356874 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7967bcbb45-bl6c8" event={"ID":"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2","Type":"ContainerDied","Data":"6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a"} Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.662459 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.666957 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.680493 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.713160 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.805752 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-dns-svc\") pod \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.806329 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-sb\") pod \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.806419 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-config\") pod \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.806610 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mdv2\" (UniqueName: \"kubernetes.io/projected/ae2c97b2-c699-443a-b3b3-ecb22de258c2-kube-api-access-8mdv2\") pod \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.806643 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-nb\") pod \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.829472 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ae2c97b2-c699-443a-b3b3-ecb22de258c2-kube-api-access-8mdv2" (OuterVolumeSpecName: "kube-api-access-8mdv2") pod "ae2c97b2-c699-443a-b3b3-ecb22de258c2" (UID: "ae2c97b2-c699-443a-b3b3-ecb22de258c2"). InnerVolumeSpecName "kube-api-access-8mdv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.908340 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bdaed74-310d-4589-9a14-8c862f05d378-logs\") pod \"3bdaed74-310d-4589-9a14-8c862f05d378\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.908481 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data\") pod \"3bdaed74-310d-4589-9a14-8c862f05d378\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.908855 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bdaed74-310d-4589-9a14-8c862f05d378-logs" (OuterVolumeSpecName: "logs") pod "3bdaed74-310d-4589-9a14-8c862f05d378" (UID: "3bdaed74-310d-4589-9a14-8c862f05d378"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.908973 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-combined-ca-bundle\") pod \"3bdaed74-310d-4589-9a14-8c862f05d378\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.909437 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data-custom\") pod \"3bdaed74-310d-4589-9a14-8c862f05d378\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.910212 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6bt4\" (UniqueName: \"kubernetes.io/projected/3bdaed74-310d-4589-9a14-8c862f05d378-kube-api-access-h6bt4\") pod \"3bdaed74-310d-4589-9a14-8c862f05d378\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.912171 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mdv2\" (UniqueName: \"kubernetes.io/projected/ae2c97b2-c699-443a-b3b3-ecb22de258c2-kube-api-access-8mdv2\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.912455 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bdaed74-310d-4589-9a14-8c862f05d378-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.933519 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3bdaed74-310d-4589-9a14-8c862f05d378" (UID: 
"3bdaed74-310d-4589-9a14-8c862f05d378"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.989957 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bdaed74-310d-4589-9a14-8c862f05d378-kube-api-access-h6bt4" (OuterVolumeSpecName: "kube-api-access-h6bt4") pod "3bdaed74-310d-4589-9a14-8c862f05d378" (UID: "3bdaed74-310d-4589-9a14-8c862f05d378"). InnerVolumeSpecName "kube-api-access-h6bt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.016547 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.016870 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6bt4\" (UniqueName: \"kubernetes.io/projected/3bdaed74-310d-4589-9a14-8c862f05d378-kube-api-access-h6bt4\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.045180 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae2c97b2-c699-443a-b3b3-ecb22de258c2" (UID: "ae2c97b2-c699-443a-b3b3-ecb22de258c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.048740 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ae2c97b2-c699-443a-b3b3-ecb22de258c2" (UID: "ae2c97b2-c699-443a-b3b3-ecb22de258c2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.050781 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ae2c97b2-c699-443a-b3b3-ecb22de258c2" (UID: "ae2c97b2-c699-443a-b3b3-ecb22de258c2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.053162 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-9xln9"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.061365 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-config" (OuterVolumeSpecName: "config") pod "ae2c97b2-c699-443a-b3b3-ecb22de258c2" (UID: "ae2c97b2-c699-443a-b3b3-ecb22de258c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.071327 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data" (OuterVolumeSpecName: "config-data") pod "3bdaed74-310d-4589-9a14-8c862f05d378" (UID: "3bdaed74-310d-4589-9a14-8c862f05d378"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.103776 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bdaed74-310d-4589-9a14-8c862f05d378" (UID: "3bdaed74-310d-4589-9a14-8c862f05d378"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.111331 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.121736 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data\") pod \"2290443f-8279-4b62-9d3d-bab0be1d7af5\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.121832 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksngs\" (UniqueName: \"kubernetes.io/projected/2290443f-8279-4b62-9d3d-bab0be1d7af5-kube-api-access-ksngs\") pod \"2290443f-8279-4b62-9d3d-bab0be1d7af5\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.121878 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-combined-ca-bundle\") pod \"2290443f-8279-4b62-9d3d-bab0be1d7af5\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.121901 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2290443f-8279-4b62-9d3d-bab0be1d7af5-logs\") pod \"2290443f-8279-4b62-9d3d-bab0be1d7af5\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.121936 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data-custom\") pod \"2290443f-8279-4b62-9d3d-bab0be1d7af5\" (UID: 
\"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.122183 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.122212 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.122223 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.122232 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.122241 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.122251 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.125477 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2290443f-8279-4b62-9d3d-bab0be1d7af5-logs" (OuterVolumeSpecName: "logs") pod "2290443f-8279-4b62-9d3d-bab0be1d7af5" (UID: "2290443f-8279-4b62-9d3d-bab0be1d7af5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.149080 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2290443f-8279-4b62-9d3d-bab0be1d7af5" (UID: "2290443f-8279-4b62-9d3d-bab0be1d7af5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.164815 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2290443f-8279-4b62-9d3d-bab0be1d7af5-kube-api-access-ksngs" (OuterVolumeSpecName: "kube-api-access-ksngs") pod "2290443f-8279-4b62-9d3d-bab0be1d7af5" (UID: "2290443f-8279-4b62-9d3d-bab0be1d7af5"). InnerVolumeSpecName "kube-api-access-ksngs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.179613 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.231599 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksngs\" (UniqueName: \"kubernetes.io/projected/2290443f-8279-4b62-9d3d-bab0be1d7af5-kube-api-access-ksngs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.231636 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2290443f-8279-4b62-9d3d-bab0be1d7af5-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.231646 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.233481 4778 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.253150 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.318270 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2290443f-8279-4b62-9d3d-bab0be1d7af5" (UID: "2290443f-8279-4b62-9d3d-bab0be1d7af5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.338340 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.349523 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.376456 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data" (OuterVolumeSpecName: "config-data") pod "2290443f-8279-4b62-9d3d-bab0be1d7af5" (UID: "2290443f-8279-4b62-9d3d-bab0be1d7af5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.407714 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.407961 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" event={"ID":"3bdaed74-310d-4589-9a14-8c862f05d378","Type":"ContainerDied","Data":"936e07a4b38cfd1b5b28cfe412a7dc4ba1163eceb3ae2617b1ff1b9e372215e3"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.408102 4778 scope.go:117] "RemoveContainer" containerID="e7e9f37ed0458c02870caf56e192acdd710a30f50cce72d6e9f4bfe098996a15" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.409873 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"25c30dcf-f49d-430b-a240-aefe036afeeb","Type":"ContainerStarted","Data":"3dd48f50ba7628f9f7d698453fd23e9975283871dd516f7a6292391921e1577b"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.423810 4778 generic.go:334] "Generic (PLEG): container finished" podID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerID="42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec" exitCode=137 Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.423868 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7967bcbb45-bl6c8" event={"ID":"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2","Type":"ContainerDied","Data":"42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.423889 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7967bcbb45-bl6c8" event={"ID":"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2","Type":"ContainerDied","Data":"0cbfa7c707107fef7fba1585919c6375feaacdc7b31da820c5b759e556860c71"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.424266 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.427020 4778 generic.go:334] "Generic (PLEG): container finished" podID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerID="97b1e09d6053130fb7ed8cd3a7a1b57d88faf0c15aceaa656cf6b72bf7bed517" exitCode=0 Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.427348 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b9647d87-2qhmh" event={"ID":"2b12c496-12d8-47e5-8cb7-134c3860368d","Type":"ContainerDied","Data":"97b1e09d6053130fb7ed8cd3a7a1b57d88faf0c15aceaa656cf6b72bf7bed517"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.438743 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679c749775-5x4dx" event={"ID":"2290443f-8279-4b62-9d3d-bab0be1d7af5","Type":"ContainerDied","Data":"44a3e56960042fea2cef1afef1f593fd3c5991c7f2d6bb676b183bc24c3832e2"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.439087 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.455729 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerStarted","Data":"e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.455914 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-central-agent" containerID="cri-o://7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372" gracePeriod=30 Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.456185 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.456446 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="proxy-httpd" containerID="cri-o://e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9" gracePeriod=30 Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.456490 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="sg-core" containerID="cri-o://19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4" gracePeriod=30 Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.456525 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-notification-agent" containerID="cri-o://d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053" gracePeriod=30 Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.466688 
4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.468024 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ef4b958-769d-43c2-91d4-3a6cb76d3851","Type":"ContainerStarted","Data":"158c8e2542c97971367812456984ca4e3f98182f67d2f9c5b6c2354ec14b4a85"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.470220 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.470943 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" event={"ID":"06a6d934-2f47-4628-a328-6ba9cefb8090","Type":"ContainerStarted","Data":"ff4d7228da7d0afd7376861dde18209e6f8e60848f76f48755825c7cbc2d2227"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.477487 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.479774 4778 scope.go:117] "RemoveContainer" containerID="358bb2fa0945b8176753e109d73c3f83c6eac33a4e8d6a66d2281628c5bd5f11" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.504273 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.528936 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.274576901 podStartE2EDuration="1m3.528907091s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="2026-03-18 09:22:55.961141974 +0000 UTC m=+1242.535886814" lastFinishedPulling="2026-03-18 09:23:55.215472164 +0000 UTC m=+1301.790217004" 
observedRunningTime="2026-03-18 09:23:56.481066688 +0000 UTC m=+1303.055811578" watchObservedRunningTime="2026-03-18 09:23:56.528907091 +0000 UTC m=+1303.103651931" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.538143 4778 scope.go:117] "RemoveContainer" containerID="42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.575901 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-679c749775-5x4dx"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.578383 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data\") pod \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.578603 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-logs\") pod \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.578805 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-scripts\") pod \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.578873 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-horizon-secret-key\") pod \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.578905 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8l9tz\" (UniqueName: \"kubernetes.io/projected/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-kube-api-access-8l9tz\") pod \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.579081 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-logs" (OuterVolumeSpecName: "logs") pod "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" (UID: "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.579431 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.585409 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" (UID: "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.612453 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-kube-api-access-8l9tz" (OuterVolumeSpecName: "kube-api-access-8l9tz") pod "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" (UID: "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2"). InnerVolumeSpecName "kube-api-access-8l9tz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.657138 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-679c749775-5x4dx"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.680894 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data" (OuterVolumeSpecName: "config-data") pod "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" (UID: "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.681931 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data\") pod \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " Mar 18 09:23:56 crc kubenswrapper[4778]: W0318 09:23:56.683481 4778 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2/volumes/kubernetes.io~configmap/config-data Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.683518 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data" (OuterVolumeSpecName: "config-data") pod "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" (UID: "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.686218 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-scripts" (OuterVolumeSpecName: "scripts") pod "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" (UID: "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.689303 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.689339 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.689353 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.689369 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l9tz\" (UniqueName: \"kubernetes.io/projected/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-kube-api-access-8l9tz\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.694249 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-p2knr"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.723550 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-p2knr"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.727917 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-6d979499f7-4flxt"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.785933 4778 scope.go:117] "RemoveContainer" containerID="6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.894348 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-56b9647d87-2qhmh" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9696/\": dial tcp 10.217.0.151:9696: connect: connection refused" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.967108 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7967bcbb45-bl6c8"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.978373 4778 scope.go:117] "RemoveContainer" containerID="42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec" Mar 18 09:23:56 crc kubenswrapper[4778]: E0318 09:23:56.980923 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec\": container with ID starting with 42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec not found: ID does not exist" containerID="42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.980964 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec"} err="failed to get container status \"42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec\": rpc error: code = NotFound desc = could not find container \"42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec\": container with ID starting with 42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec not found: ID does not exist" Mar 18 09:23:56 crc 
kubenswrapper[4778]: I0318 09:23:56.980989 4778 scope.go:117] "RemoveContainer" containerID="6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a" Mar 18 09:23:56 crc kubenswrapper[4778]: E0318 09:23:56.981470 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a\": container with ID starting with 6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a not found: ID does not exist" containerID="6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.981501 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a"} err="failed to get container status \"6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a\": rpc error: code = NotFound desc = could not find container \"6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a\": container with ID starting with 6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a not found: ID does not exist" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.981518 4778 scope.go:117] "RemoveContainer" containerID="fa720a6916b9236597d16b19d9dd408cdda0fb22a3b213c1f2335a3530e41d8d" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.992521 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7967bcbb45-bl6c8"] Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.191486 4778 scope.go:117] "RemoveContainer" containerID="e9f7b0df052a4e1bc6dea1dc19739579966633e37f974824883fab15b09a8f39" Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.553551 4778 generic.go:334] "Generic (PLEG): container finished" podID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerID="19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4" exitCode=2 Mar 18 
09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.553777 4778 generic.go:334] "Generic (PLEG): container finished" podID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerID="7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372" exitCode=0 Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.553818 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerDied","Data":"19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4"} Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.553845 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerDied","Data":"7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372"} Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.561114 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ef4b958-769d-43c2-91d4-3a6cb76d3851","Type":"ContainerStarted","Data":"825b48041531c5b0c01f329609e8aff59582830809964a4de416fba30a539f48"} Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.563404 4778 generic.go:334] "Generic (PLEG): container finished" podID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerID="9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6" exitCode=0 Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.563467 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" event={"ID":"06a6d934-2f47-4628-a328-6ba9cefb8090","Type":"ContainerDied","Data":"9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6"} Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.580022 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d979499f7-4flxt" 
event={"ID":"da263057-3652-4ae8-8435-4f80e4b13804","Type":"ContainerStarted","Data":"3d0673c880513eabd90c660c0aca8f21e04bcaa48ea7b76a58c69bafee70f7d5"} Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.580256 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d979499f7-4flxt" event={"ID":"da263057-3652-4ae8-8435-4f80e4b13804","Type":"ContainerStarted","Data":"be750524dac8eb7fa21fef767ce16f355c71ddf39141f72d801e5166941cb488"} Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.580389 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.629349 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d979499f7-4flxt" podStartSLOduration=3.629325946 podStartE2EDuration="3.629325946s" podCreationTimestamp="2026-03-18 09:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:57.614679297 +0000 UTC m=+1304.189424137" watchObservedRunningTime="2026-03-18 09:23:57.629325946 +0000 UTC m=+1304.204070786" Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.231351 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" path="/var/lib/kubelet/pods/2290443f-8279-4b62-9d3d-bab0be1d7af5/volumes" Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.241502 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" path="/var/lib/kubelet/pods/3bdaed74-310d-4589-9a14-8c862f05d378/volumes" Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.242412 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" path="/var/lib/kubelet/pods/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2/volumes" Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 
09:23:58.242978 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" path="/var/lib/kubelet/pods/ae2c97b2-c699-443a-b3b3-ecb22de258c2/volumes" Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.533680 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.635414 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ef4b958-769d-43c2-91d4-3a6cb76d3851","Type":"ContainerStarted","Data":"c901fa502e83a3c43053baeb3fceb6b85a02d87dc36f812db5898d2cee7e9935"} Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.635746 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api-log" containerID="cri-o://825b48041531c5b0c01f329609e8aff59582830809964a4de416fba30a539f48" gracePeriod=30 Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.638563 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api" containerID="cri-o://c901fa502e83a3c43053baeb3fceb6b85a02d87dc36f812db5898d2cee7e9935" gracePeriod=30 Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.638778 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.644405 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" event={"ID":"06a6d934-2f47-4628-a328-6ba9cefb8090","Type":"ContainerStarted","Data":"623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42"} Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.645632 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.649881 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d979499f7-4flxt" event={"ID":"da263057-3652-4ae8-8435-4f80e4b13804","Type":"ContainerStarted","Data":"4303b578c6acd544585106fa3ebfc66574b0ed2f544b92af5efb19b9d55f68b4"} Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.654484 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"25c30dcf-f49d-430b-a240-aefe036afeeb","Type":"ContainerStarted","Data":"33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785"} Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.683407 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.683371457 podStartE2EDuration="7.683371457s" podCreationTimestamp="2026-03-18 09:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:58.681653761 +0000 UTC m=+1305.256398621" watchObservedRunningTime="2026-03-18 09:23:58.683371457 +0000 UTC m=+1305.258116297" Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.725953 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" podStartSLOduration=7.725930876 podStartE2EDuration="7.725930876s" podCreationTimestamp="2026-03-18 09:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:58.711634476 +0000 UTC m=+1305.286379316" watchObservedRunningTime="2026-03-18 09:23:58.725930876 +0000 UTC m=+1305.300675716" Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.865758 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:58 crc 
kubenswrapper[4778]: I0318 09:23:58.931210 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-99c8bfc86-rldfg"] Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.931543 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-99c8bfc86-rldfg" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon-log" containerID="cri-o://d66d99949ad5120620bffa75ea4a3e49aee89967a841972aa06a4a9867cff673" gracePeriod=30 Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.931720 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-99c8bfc86-rldfg" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" containerID="cri-o://c97b0574c01538fa1c2084f0fe2463a0e870b56db44f51730acba68dbd0ebf25" gracePeriod=30 Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.194283 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: i/o timeout" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.361884 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.463359 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm7dv\" (UniqueName: \"kubernetes.io/projected/48e51d3c-7f9f-4196-b708-c28c0e7477fa-kube-api-access-qm7dv\") pod \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.464799 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data-custom\") pod \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.464850 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-combined-ca-bundle\") pod \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.464906 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data\") pod \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.464971 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48e51d3c-7f9f-4196-b708-c28c0e7477fa-logs\") pod \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.465769 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/48e51d3c-7f9f-4196-b708-c28c0e7477fa-logs" (OuterVolumeSpecName: "logs") pod "48e51d3c-7f9f-4196-b708-c28c0e7477fa" (UID: "48e51d3c-7f9f-4196-b708-c28c0e7477fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.475298 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "48e51d3c-7f9f-4196-b708-c28c0e7477fa" (UID: "48e51d3c-7f9f-4196-b708-c28c0e7477fa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.475508 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e51d3c-7f9f-4196-b708-c28c0e7477fa-kube-api-access-qm7dv" (OuterVolumeSpecName: "kube-api-access-qm7dv") pod "48e51d3c-7f9f-4196-b708-c28c0e7477fa" (UID: "48e51d3c-7f9f-4196-b708-c28c0e7477fa"). InnerVolumeSpecName "kube-api-access-qm7dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.515506 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48e51d3c-7f9f-4196-b708-c28c0e7477fa" (UID: "48e51d3c-7f9f-4196-b708-c28c0e7477fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.544425 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data" (OuterVolumeSpecName: "config-data") pod "48e51d3c-7f9f-4196-b708-c28c0e7477fa" (UID: "48e51d3c-7f9f-4196-b708-c28c0e7477fa"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.567145 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm7dv\" (UniqueName: \"kubernetes.io/projected/48e51d3c-7f9f-4196-b708-c28c0e7477fa-kube-api-access-qm7dv\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.567182 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.567225 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.567239 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.567250 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48e51d3c-7f9f-4196-b708-c28c0e7477fa-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.670912 4778 generic.go:334] "Generic (PLEG): container finished" podID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerID="825b48041531c5b0c01f329609e8aff59582830809964a4de416fba30a539f48" exitCode=143 Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.670997 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ef4b958-769d-43c2-91d4-3a6cb76d3851","Type":"ContainerDied","Data":"825b48041531c5b0c01f329609e8aff59582830809964a4de416fba30a539f48"} Mar 18 09:23:59 crc 
kubenswrapper[4778]: I0318 09:23:59.673959 4778 generic.go:334] "Generic (PLEG): container finished" podID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerID="f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad" exitCode=0 Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.674021 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df89bff66-xp7n4" event={"ID":"48e51d3c-7f9f-4196-b708-c28c0e7477fa","Type":"ContainerDied","Data":"f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad"} Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.674049 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df89bff66-xp7n4" event={"ID":"48e51d3c-7f9f-4196-b708-c28c0e7477fa","Type":"ContainerDied","Data":"b165bcb9da7478d0edb4f2c1e6142d197b4e354bbdd69e9690efc132cc26e106"} Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.674068 4778 scope.go:117] "RemoveContainer" containerID="f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.674242 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.679403 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"25c30dcf-f49d-430b-a240-aefe036afeeb","Type":"ContainerStarted","Data":"7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78"} Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.692699 4778 generic.go:334] "Generic (PLEG): container finished" podID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerID="1b28dfe7d2202acd51cd4556f1c17c3749cac234f100a94b65f9772c04f97b62" exitCode=0 Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.693576 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b9647d87-2qhmh" event={"ID":"2b12c496-12d8-47e5-8cb7-134c3860368d","Type":"ContainerDied","Data":"1b28dfe7d2202acd51cd4556f1c17c3749cac234f100a94b65f9772c04f97b62"} Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.717769 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.71291037 podStartE2EDuration="8.717752322s" podCreationTimestamp="2026-03-18 09:23:51 +0000 UTC" firstStartedPulling="2026-03-18 09:23:56.252563733 +0000 UTC m=+1302.827308573" lastFinishedPulling="2026-03-18 09:23:57.257405685 +0000 UTC m=+1303.832150525" observedRunningTime="2026-03-18 09:23:59.707092463 +0000 UTC m=+1306.281837323" watchObservedRunningTime="2026-03-18 09:23:59.717752322 +0000 UTC m=+1306.292497162" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.735973 4778 scope.go:117] "RemoveContainer" containerID="3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.741694 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-df89bff66-xp7n4"] Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.762336 4778 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/barbican-api-df89bff66-xp7n4"] Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.824273 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.833729 4778 scope.go:117] "RemoveContainer" containerID="f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad" Mar 18 09:23:59 crc kubenswrapper[4778]: E0318 09:23:59.834262 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad\": container with ID starting with f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad not found: ID does not exist" containerID="f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.834309 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad"} err="failed to get container status \"f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad\": rpc error: code = NotFound desc = could not find container \"f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad\": container with ID starting with f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad not found: ID does not exist" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.834335 4778 scope.go:117] "RemoveContainer" containerID="3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33" Mar 18 09:23:59 crc kubenswrapper[4778]: E0318 09:23:59.834701 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33\": container with ID starting with 
3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33 not found: ID does not exist" containerID="3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.834733 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33"} err="failed to get container status \"3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33\": rpc error: code = NotFound desc = could not find container \"3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33\": container with ID starting with 3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33 not found: ID does not exist" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.880624 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-combined-ca-bundle\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.880707 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-internal-tls-certs\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.880756 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-config\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.880891 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-public-tls-certs\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.880962 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-httpd-config\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.881006 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-ovndb-tls-certs\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.881054 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb97b\" (UniqueName: \"kubernetes.io/projected/2b12c496-12d8-47e5-8cb7-134c3860368d-kube-api-access-qb97b\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.892469 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b12c496-12d8-47e5-8cb7-134c3860368d-kube-api-access-qb97b" (OuterVolumeSpecName: "kube-api-access-qb97b") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "kube-api-access-qb97b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.900757 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.936895 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.957040 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-config" (OuterVolumeSpecName: "config") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.959772 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.968419 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.983808 4778 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.983840 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb97b\" (UniqueName: \"kubernetes.io/projected/2b12c496-12d8-47e5-8cb7-134c3860368d-kube-api-access-qb97b\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.983851 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.983860 4778 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.983869 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.983878 4778 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.992443 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.085222 4778 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145513 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563764-s4crc"] Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145840 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145857 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145870 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-api" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145876 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-api" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145888 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="dnsmasq-dns" Mar 18 09:24:00 crc 
kubenswrapper[4778]: I0318 09:24:00.145894 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="dnsmasq-dns" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145908 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-httpd" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145915 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-httpd" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145928 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145933 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon-log" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145944 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145979 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145992 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145998 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.146011 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="init" Mar 18 09:24:00 crc kubenswrapper[4778]: 
I0318 09:24:00.146018 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="init" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.146028 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146034 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener-log" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.146043 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146049 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api-log" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.146059 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146066 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.146074 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146079 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146278 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon" Mar 18 09:24:00 crc 
kubenswrapper[4778]: I0318 09:24:00.146293 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="dnsmasq-dns" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146303 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146313 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146321 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-httpd" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146330 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146342 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-api" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146353 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146364 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146374 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146383 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" 
containerName="barbican-worker" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146916 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.149438 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.151571 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.151924 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.170471 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563764-s4crc"] Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.187111 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlmf5\" (UniqueName: \"kubernetes.io/projected/bb3b2d75-fc85-48dc-8533-18ecd8c75187-kube-api-access-xlmf5\") pod \"auto-csr-approver-29563764-s4crc\" (UID: \"bb3b2d75-fc85-48dc-8533-18ecd8c75187\") " pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.200410 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" path="/var/lib/kubelet/pods/48e51d3c-7f9f-4196-b708-c28c0e7477fa/volumes" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.289347 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlmf5\" (UniqueName: \"kubernetes.io/projected/bb3b2d75-fc85-48dc-8533-18ecd8c75187-kube-api-access-xlmf5\") pod \"auto-csr-approver-29563764-s4crc\" (UID: \"bb3b2d75-fc85-48dc-8533-18ecd8c75187\") " 
pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.307841 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlmf5\" (UniqueName: \"kubernetes.io/projected/bb3b2d75-fc85-48dc-8533-18ecd8c75187-kube-api-access-xlmf5\") pod \"auto-csr-approver-29563764-s4crc\" (UID: \"bb3b2d75-fc85-48dc-8533-18ecd8c75187\") " pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.465531 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.718907 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b9647d87-2qhmh" event={"ID":"2b12c496-12d8-47e5-8cb7-134c3860368d","Type":"ContainerDied","Data":"e8df8ad489699b62f3719756fb074fcb6bc53de4a98faae3059fd18d69bd24b5"} Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.719393 4778 scope.go:117] "RemoveContainer" containerID="97b1e09d6053130fb7ed8cd3a7a1b57d88faf0c15aceaa656cf6b72bf7bed517" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.720273 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.784503 4778 scope.go:117] "RemoveContainer" containerID="1b28dfe7d2202acd51cd4556f1c17c3749cac234f100a94b65f9772c04f97b62" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.785893 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56b9647d87-2qhmh"] Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.794089 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56b9647d87-2qhmh"] Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.928380 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563764-s4crc"] Mar 18 09:24:00 crc kubenswrapper[4778]: W0318 09:24:00.936383 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb3b2d75_fc85_48dc_8533_18ecd8c75187.slice/crio-fbfc405090c6b9b3a4cbae242297821fa7e208a6f4adf0dc492b8aad2c4c1a95 WatchSource:0}: Error finding container fbfc405090c6b9b3a4cbae242297821fa7e208a6f4adf0dc492b8aad2c4c1a95: Status 404 returned error can't find the container with id fbfc405090c6b9b3a4cbae242297821fa7e208a6f4adf0dc492b8aad2c4c1a95 Mar 18 09:24:01 crc kubenswrapper[4778]: I0318 09:24:01.712677 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 09:24:01 crc kubenswrapper[4778]: I0318 09:24:01.731210 4778 generic.go:334] "Generic (PLEG): container finished" podID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerID="d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053" exitCode=0 Mar 18 09:24:01 crc kubenswrapper[4778]: I0318 09:24:01.731300 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerDied","Data":"d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053"} Mar 18 09:24:01 crc kubenswrapper[4778]: I0318 09:24:01.732491 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563764-s4crc" event={"ID":"bb3b2d75-fc85-48dc-8533-18ecd8c75187","Type":"ContainerStarted","Data":"fbfc405090c6b9b3a4cbae242297821fa7e208a6f4adf0dc492b8aad2c4c1a95"} Mar 18 09:24:02 crc kubenswrapper[4778]: I0318 09:24:02.198972 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" path="/var/lib/kubelet/pods/2b12c496-12d8-47e5-8cb7-134c3860368d/volumes" Mar 18 09:24:02 crc kubenswrapper[4778]: I0318 09:24:02.744305 4778 generic.go:334] "Generic (PLEG): container finished" podID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerID="c97b0574c01538fa1c2084f0fe2463a0e870b56db44f51730acba68dbd0ebf25" exitCode=0 Mar 18 09:24:02 crc kubenswrapper[4778]: I0318 09:24:02.744656 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-99c8bfc86-rldfg" event={"ID":"ea1f8a48-d595-4f4e-a740-0af5a26397a5","Type":"ContainerDied","Data":"c97b0574c01538fa1c2084f0fe2463a0e870b56db44f51730acba68dbd0ebf25"} Mar 18 09:24:02 crc kubenswrapper[4778]: I0318 09:24:02.746729 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563764-s4crc" event={"ID":"bb3b2d75-fc85-48dc-8533-18ecd8c75187","Type":"ContainerStarted","Data":"2ec42f2618fc279e3b11e295de9307609aec937968481eb47ae40bf89eeec176"} Mar 18 09:24:02 crc kubenswrapper[4778]: I0318 09:24:02.780544 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563764-s4crc" podStartSLOduration=1.514659989 podStartE2EDuration="2.78051597s" podCreationTimestamp="2026-03-18 09:24:00 +0000 UTC" firstStartedPulling="2026-03-18 09:24:00.940768897 +0000 UTC 
m=+1307.515513737" lastFinishedPulling="2026-03-18 09:24:02.206624878 +0000 UTC m=+1308.781369718" observedRunningTime="2026-03-18 09:24:02.764146574 +0000 UTC m=+1309.338891454" watchObservedRunningTime="2026-03-18 09:24:02.78051597 +0000 UTC m=+1309.355260840" Mar 18 09:24:02 crc kubenswrapper[4778]: I0318 09:24:02.783345 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-99c8bfc86-rldfg" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Mar 18 09:24:03 crc kubenswrapper[4778]: I0318 09:24:03.756730 4778 generic.go:334] "Generic (PLEG): container finished" podID="bb3b2d75-fc85-48dc-8533-18ecd8c75187" containerID="2ec42f2618fc279e3b11e295de9307609aec937968481eb47ae40bf89eeec176" exitCode=0 Mar 18 09:24:03 crc kubenswrapper[4778]: I0318 09:24:03.756837 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563764-s4crc" event={"ID":"bb3b2d75-fc85-48dc-8533-18ecd8c75187","Type":"ContainerDied","Data":"2ec42f2618fc279e3b11e295de9307609aec937968481eb47ae40bf89eeec176"} Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.208292 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.301160 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlmf5\" (UniqueName: \"kubernetes.io/projected/bb3b2d75-fc85-48dc-8533-18ecd8c75187-kube-api-access-xlmf5\") pod \"bb3b2d75-fc85-48dc-8533-18ecd8c75187\" (UID: \"bb3b2d75-fc85-48dc-8533-18ecd8c75187\") " Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.306628 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3b2d75-fc85-48dc-8533-18ecd8c75187-kube-api-access-xlmf5" (OuterVolumeSpecName: "kube-api-access-xlmf5") pod "bb3b2d75-fc85-48dc-8533-18ecd8c75187" (UID: "bb3b2d75-fc85-48dc-8533-18ecd8c75187"). InnerVolumeSpecName "kube-api-access-xlmf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.403649 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlmf5\" (UniqueName: \"kubernetes.io/projected/bb3b2d75-fc85-48dc-8533-18ecd8c75187-kube-api-access-xlmf5\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.780815 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563764-s4crc" event={"ID":"bb3b2d75-fc85-48dc-8533-18ecd8c75187","Type":"ContainerDied","Data":"fbfc405090c6b9b3a4cbae242297821fa7e208a6f4adf0dc492b8aad2c4c1a95"} Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.780851 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbfc405090c6b9b3a4cbae242297821fa7e208a6f4adf0dc492b8aad2c4c1a95" Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.781185 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.866479 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563758-zslfz"] Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.883822 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563758-zslfz"] Mar 18 09:24:06 crc kubenswrapper[4778]: I0318 09:24:06.205839 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70faab0-9f07-4452-a873-bcb59d28b7a8" path="/var/lib/kubelet/pods/c70faab0-9f07-4452-a873-bcb59d28b7a8/volumes" Mar 18 09:24:06 crc kubenswrapper[4778]: I0318 09:24:06.836375 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:24:06 crc kubenswrapper[4778]: I0318 09:24:06.903005 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-2zrdr"] Mar 18 09:24:06 crc kubenswrapper[4778]: I0318 09:24:06.903295 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerName="dnsmasq-dns" containerID="cri-o://9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed" gracePeriod=10 Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.045273 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.096428 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.487262 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.546833 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-nb\") pod \"1cc8be70-e875-4307-89a5-9cbb0d105b86\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.547043 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-sb\") pod \"1cc8be70-e875-4307-89a5-9cbb0d105b86\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.547190 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-config\") pod \"1cc8be70-e875-4307-89a5-9cbb0d105b86\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.547276 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrb4m\" (UniqueName: \"kubernetes.io/projected/1cc8be70-e875-4307-89a5-9cbb0d105b86-kube-api-access-nrb4m\") pod \"1cc8be70-e875-4307-89a5-9cbb0d105b86\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.548277 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-dns-svc\") pod \"1cc8be70-e875-4307-89a5-9cbb0d105b86\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.558881 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1cc8be70-e875-4307-89a5-9cbb0d105b86-kube-api-access-nrb4m" (OuterVolumeSpecName: "kube-api-access-nrb4m") pod "1cc8be70-e875-4307-89a5-9cbb0d105b86" (UID: "1cc8be70-e875-4307-89a5-9cbb0d105b86"). InnerVolumeSpecName "kube-api-access-nrb4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.650958 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrb4m\" (UniqueName: \"kubernetes.io/projected/1cc8be70-e875-4307-89a5-9cbb0d105b86-kube-api-access-nrb4m\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.735829 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1cc8be70-e875-4307-89a5-9cbb0d105b86" (UID: "1cc8be70-e875-4307-89a5-9cbb0d105b86"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.735851 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1cc8be70-e875-4307-89a5-9cbb0d105b86" (UID: "1cc8be70-e875-4307-89a5-9cbb0d105b86"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.736275 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-config" (OuterVolumeSpecName: "config") pod "1cc8be70-e875-4307-89a5-9cbb0d105b86" (UID: "1cc8be70-e875-4307-89a5-9cbb0d105b86"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.736934 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1cc8be70-e875-4307-89a5-9cbb0d105b86" (UID: "1cc8be70-e875-4307-89a5-9cbb0d105b86"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.752901 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.753158 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.753291 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.753387 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.802089 4778 generic.go:334] "Generic (PLEG): container finished" podID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerID="9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed" exitCode=0 Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.802450 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" 
containerName="cinder-scheduler" containerID="cri-o://33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785" gracePeriod=30 Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.802879 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.805295 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" event={"ID":"1cc8be70-e875-4307-89a5-9cbb0d105b86","Type":"ContainerDied","Data":"9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed"} Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.805352 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" event={"ID":"1cc8be70-e875-4307-89a5-9cbb0d105b86","Type":"ContainerDied","Data":"cc995beb30e43a9c4d675261cc6a1346b30a1e32f69b240f14fdcb175b6ff16e"} Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.805381 4778 scope.go:117] "RemoveContainer" containerID="9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.805933 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="probe" containerID="cri-o://7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78" gracePeriod=30 Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.828985 4778 scope.go:117] "RemoveContainer" containerID="5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.837895 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-2zrdr"] Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.847189 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-2zrdr"] Mar 18 09:24:07 crc 
kubenswrapper[4778]: I0318 09:24:07.863566 4778 scope.go:117] "RemoveContainer" containerID="9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed" Mar 18 09:24:07 crc kubenswrapper[4778]: E0318 09:24:07.864029 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed\": container with ID starting with 9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed not found: ID does not exist" containerID="9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.864067 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed"} err="failed to get container status \"9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed\": rpc error: code = NotFound desc = could not find container \"9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed\": container with ID starting with 9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed not found: ID does not exist" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.864094 4778 scope.go:117] "RemoveContainer" containerID="5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8" Mar 18 09:24:07 crc kubenswrapper[4778]: E0318 09:24:07.864470 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8\": container with ID starting with 5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8 not found: ID does not exist" containerID="5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.864489 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8"} err="failed to get container status \"5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8\": rpc error: code = NotFound desc = could not find container \"5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8\": container with ID starting with 5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8 not found: ID does not exist" Mar 18 09:24:08 crc kubenswrapper[4778]: I0318 09:24:08.198629 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" path="/var/lib/kubelet/pods/1cc8be70-e875-4307-89a5-9cbb0d105b86/volumes" Mar 18 09:24:08 crc kubenswrapper[4778]: I0318 09:24:08.815995 4778 generic.go:334] "Generic (PLEG): container finished" podID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerID="7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78" exitCode=0 Mar 18 09:24:08 crc kubenswrapper[4778]: I0318 09:24:08.816064 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"25c30dcf-f49d-430b-a240-aefe036afeeb","Type":"ContainerDied","Data":"7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78"} Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.091032 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.407784 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.462612 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.699424 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7588d8786-t6x7l"] Mar 18 09:24:09 crc kubenswrapper[4778]: 
E0318 09:24:09.699782 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3b2d75-fc85-48dc-8533-18ecd8c75187" containerName="oc" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.699798 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3b2d75-fc85-48dc-8533-18ecd8c75187" containerName="oc" Mar 18 09:24:09 crc kubenswrapper[4778]: E0318 09:24:09.699812 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerName="init" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.699820 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerName="init" Mar 18 09:24:09 crc kubenswrapper[4778]: E0318 09:24:09.699835 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerName="dnsmasq-dns" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.699843 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerName="dnsmasq-dns" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.699992 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerName="dnsmasq-dns" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.700007 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3b2d75-fc85-48dc-8533-18ecd8c75187" containerName="oc" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.700833 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.774091 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7588d8786-t6x7l"] Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794234 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqf7m\" (UniqueName: \"kubernetes.io/projected/fe0de426-6927-42ea-8b29-8bc01c27fe69-kube-api-access-qqf7m\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794290 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-internal-tls-certs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794413 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0de426-6927-42ea-8b29-8bc01c27fe69-logs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794447 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-combined-ca-bundle\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794483 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-config-data\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794505 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-public-tls-certs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794543 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-scripts\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.896602 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-config-data\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.896669 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-public-tls-certs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.896745 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-scripts\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.896809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqf7m\" (UniqueName: \"kubernetes.io/projected/fe0de426-6927-42ea-8b29-8bc01c27fe69-kube-api-access-qqf7m\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.896845 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-internal-tls-certs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.898340 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0de426-6927-42ea-8b29-8bc01c27fe69-logs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.898451 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-combined-ca-bundle\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.898732 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0de426-6927-42ea-8b29-8bc01c27fe69-logs\") pod \"placement-7588d8786-t6x7l\" 
(UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.903482 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-internal-tls-certs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.903951 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-public-tls-certs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.904836 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-combined-ca-bundle\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.910070 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-config-data\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.910069 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-scripts\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 
09:24:09.922085 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqf7m\" (UniqueName: \"kubernetes.io/projected/fe0de426-6927-42ea-8b29-8bc01c27fe69-kube-api-access-qqf7m\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.036851 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.259667 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.623028 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7588d8786-t6x7l"] Mar 18 09:24:10 crc kubenswrapper[4778]: W0318 09:24:10.631471 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe0de426_6927_42ea_8b29_8bc01c27fe69.slice/crio-e3cc921221ed90cdcc3c47fb2bda02a8ea0bdcdd4808d3e2a3081aecc5ad2388 WatchSource:0}: Error finding container e3cc921221ed90cdcc3c47fb2bda02a8ea0bdcdd4808d3e2a3081aecc5ad2388: Status 404 returned error can't find the container with id e3cc921221ed90cdcc3c47fb2bda02a8ea0bdcdd4808d3e2a3081aecc5ad2388 Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.635275 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.719837 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data\") pod \"25c30dcf-f49d-430b-a240-aefe036afeeb\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.719896 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mnd9\" (UniqueName: \"kubernetes.io/projected/25c30dcf-f49d-430b-a240-aefe036afeeb-kube-api-access-5mnd9\") pod \"25c30dcf-f49d-430b-a240-aefe036afeeb\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.719954 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-scripts\") pod \"25c30dcf-f49d-430b-a240-aefe036afeeb\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.719979 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-combined-ca-bundle\") pod \"25c30dcf-f49d-430b-a240-aefe036afeeb\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.720122 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data-custom\") pod \"25c30dcf-f49d-430b-a240-aefe036afeeb\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.720155 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/25c30dcf-f49d-430b-a240-aefe036afeeb-etc-machine-id\") pod \"25c30dcf-f49d-430b-a240-aefe036afeeb\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.720662 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25c30dcf-f49d-430b-a240-aefe036afeeb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "25c30dcf-f49d-430b-a240-aefe036afeeb" (UID: "25c30dcf-f49d-430b-a240-aefe036afeeb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.725139 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-scripts" (OuterVolumeSpecName: "scripts") pod "25c30dcf-f49d-430b-a240-aefe036afeeb" (UID: "25c30dcf-f49d-430b-a240-aefe036afeeb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.725246 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "25c30dcf-f49d-430b-a240-aefe036afeeb" (UID: "25c30dcf-f49d-430b-a240-aefe036afeeb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.725714 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25c30dcf-f49d-430b-a240-aefe036afeeb-kube-api-access-5mnd9" (OuterVolumeSpecName: "kube-api-access-5mnd9") pod "25c30dcf-f49d-430b-a240-aefe036afeeb" (UID: "25c30dcf-f49d-430b-a240-aefe036afeeb"). InnerVolumeSpecName "kube-api-access-5mnd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.771067 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25c30dcf-f49d-430b-a240-aefe036afeeb" (UID: "25c30dcf-f49d-430b-a240-aefe036afeeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.822176 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mnd9\" (UniqueName: \"kubernetes.io/projected/25c30dcf-f49d-430b-a240-aefe036afeeb-kube-api-access-5mnd9\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.822223 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.822234 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.822248 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.822258 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c30dcf-f49d-430b-a240-aefe036afeeb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.824100 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data" (OuterVolumeSpecName: "config-data") pod "25c30dcf-f49d-430b-a240-aefe036afeeb" (UID: "25c30dcf-f49d-430b-a240-aefe036afeeb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.834160 4778 generic.go:334] "Generic (PLEG): container finished" podID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerID="33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785" exitCode=0 Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.834252 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"25c30dcf-f49d-430b-a240-aefe036afeeb","Type":"ContainerDied","Data":"33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785"} Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.834277 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.834286 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"25c30dcf-f49d-430b-a240-aefe036afeeb","Type":"ContainerDied","Data":"3dd48f50ba7628f9f7d698453fd23e9975283871dd516f7a6292391921e1577b"} Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.834307 4778 scope.go:117] "RemoveContainer" containerID="7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.839358 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7588d8786-t6x7l" event={"ID":"fe0de426-6927-42ea-8b29-8bc01c27fe69","Type":"ContainerStarted","Data":"e3cc921221ed90cdcc3c47fb2bda02a8ea0bdcdd4808d3e2a3081aecc5ad2388"} Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.865541 4778 scope.go:117] "RemoveContainer" containerID="33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785" 
Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.870109 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.880722 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.896547 4778 scope.go:117] "RemoveContainer" containerID="7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78" Mar 18 09:24:10 crc kubenswrapper[4778]: E0318 09:24:10.897179 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78\": container with ID starting with 7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78 not found: ID does not exist" containerID="7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.897237 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78"} err="failed to get container status \"7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78\": rpc error: code = NotFound desc = could not find container \"7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78\": container with ID starting with 7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78 not found: ID does not exist" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.897264 4778 scope.go:117] "RemoveContainer" containerID="33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785" Mar 18 09:24:10 crc kubenswrapper[4778]: E0318 09:24:10.897555 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785\": container with 
ID starting with 33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785 not found: ID does not exist" containerID="33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.897586 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785"} err="failed to get container status \"33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785\": rpc error: code = NotFound desc = could not find container \"33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785\": container with ID starting with 33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785 not found: ID does not exist" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.903452 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:24:10 crc kubenswrapper[4778]: E0318 09:24:10.904344 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="probe" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.904374 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="probe" Mar 18 09:24:10 crc kubenswrapper[4778]: E0318 09:24:10.904404 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="cinder-scheduler" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.904414 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="cinder-scheduler" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.904666 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="probe" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.904730 4778 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="cinder-scheduler" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.906410 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.909436 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.924941 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.924966 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.925437 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-scripts\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.925640 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.925736 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/bbde13ad-dacc-4f17-8da3-109ede6972c0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.925842 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6vkn\" (UniqueName: \"kubernetes.io/projected/bbde13ad-dacc-4f17-8da3-109ede6972c0-kube-api-access-p6vkn\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.925919 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-config-data\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.926039 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.027593 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.028137 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bbde13ad-dacc-4f17-8da3-109ede6972c0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc 
kubenswrapper[4778]: I0318 09:24:11.028233 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6vkn\" (UniqueName: \"kubernetes.io/projected/bbde13ad-dacc-4f17-8da3-109ede6972c0-kube-api-access-p6vkn\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.028267 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-config-data\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.028345 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.028387 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bbde13ad-dacc-4f17-8da3-109ede6972c0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.028399 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-scripts\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.031326 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.032516 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-scripts\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.033861 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-config-data\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.034066 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.048750 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6vkn\" (UniqueName: \"kubernetes.io/projected/bbde13ad-dacc-4f17-8da3-109ede6972c0-kube-api-access-p6vkn\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.154686 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.455259 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.853957 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bbde13ad-dacc-4f17-8da3-109ede6972c0","Type":"ContainerStarted","Data":"b68a2f1177e8618f5371c42c181f2a2a72c3ec262ce9c4255d1091a4422376d1"} Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.859820 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7588d8786-t6x7l" event={"ID":"fe0de426-6927-42ea-8b29-8bc01c27fe69","Type":"ContainerStarted","Data":"70f41166643e5c57059949763b954cffb1995b94972c432b8661eaffd305357c"} Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.859850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7588d8786-t6x7l" event={"ID":"fe0de426-6927-42ea-8b29-8bc01c27fe69","Type":"ContainerStarted","Data":"e42aa40ce87ef108a835e400ee7d0af65765c83a28c90ec93577dc2879e7039c"} Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.860424 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.890570 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7588d8786-t6x7l" podStartSLOduration=2.890442807 podStartE2EDuration="2.890442807s" podCreationTimestamp="2026-03-18 09:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:24:11.883673602 +0000 UTC m=+1318.458418452" watchObservedRunningTime="2026-03-18 09:24:11.890442807 +0000 UTC m=+1318.465187657" Mar 18 09:24:12 crc kubenswrapper[4778]: I0318 09:24:12.219684 4778 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" path="/var/lib/kubelet/pods/25c30dcf-f49d-430b-a240-aefe036afeeb/volumes" Mar 18 09:24:12 crc kubenswrapper[4778]: I0318 09:24:12.783288 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-99c8bfc86-rldfg" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Mar 18 09:24:12 crc kubenswrapper[4778]: I0318 09:24:12.885970 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bbde13ad-dacc-4f17-8da3-109ede6972c0","Type":"ContainerStarted","Data":"b1277af22eaff22c034718b40d19fb0722f73ea78fc1e330fcacc9be3f0bbfc1"} Mar 18 09:24:12 crc kubenswrapper[4778]: I0318 09:24:12.886037 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bbde13ad-dacc-4f17-8da3-109ede6972c0","Type":"ContainerStarted","Data":"5bc9419baaf6a9eba429b9f4b0e19c573040ff85ad0dec56617879e900848eec"} Mar 18 09:24:12 crc kubenswrapper[4778]: I0318 09:24:12.886120 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:12 crc kubenswrapper[4778]: I0318 09:24:12.902769 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.902749161 podStartE2EDuration="2.902749161s" podCreationTimestamp="2026-03-18 09:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:24:12.90091239 +0000 UTC m=+1319.475657230" watchObservedRunningTime="2026-03-18 09:24:12.902749161 +0000 UTC m=+1319.477494001" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.280131 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/openstackclient"] Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.282410 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.287913 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.288114 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-vqzw8" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.288284 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.299186 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.318251 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl5jq\" (UniqueName: \"kubernetes.io/projected/fec302c3-e5fc-4019-b4f5-50de6bdde59f-kube-api-access-hl5jq\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.318299 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fec302c3-e5fc-4019-b4f5-50de6bdde59f-openstack-config\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.318335 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec302c3-e5fc-4019-b4f5-50de6bdde59f-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.318413 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fec302c3-e5fc-4019-b4f5-50de6bdde59f-openstack-config-secret\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.420487 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl5jq\" (UniqueName: \"kubernetes.io/projected/fec302c3-e5fc-4019-b4f5-50de6bdde59f-kube-api-access-hl5jq\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.420547 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fec302c3-e5fc-4019-b4f5-50de6bdde59f-openstack-config\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.420596 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec302c3-e5fc-4019-b4f5-50de6bdde59f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.420697 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fec302c3-e5fc-4019-b4f5-50de6bdde59f-openstack-config-secret\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient" Mar 18 09:24:14 crc 
kubenswrapper[4778]: I0318 09:24:14.421845 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fec302c3-e5fc-4019-b4f5-50de6bdde59f-openstack-config\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.429019 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec302c3-e5fc-4019-b4f5-50de6bdde59f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.429893 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fec302c3-e5fc-4019-b4f5-50de6bdde59f-openstack-config-secret\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.439765 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl5jq\" (UniqueName: \"kubernetes.io/projected/fec302c3-e5fc-4019-b4f5-50de6bdde59f-kube-api-access-hl5jq\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient" Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.632832 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 09:24:15 crc kubenswrapper[4778]: I0318 09:24:15.131693 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 09:24:15 crc kubenswrapper[4778]: I0318 09:24:15.933675 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fec302c3-e5fc-4019-b4f5-50de6bdde59f","Type":"ContainerStarted","Data":"374277d6bb2e47d21523d2e4fedd07b943036d6c14091bdefdbcc052923e0497"} Mar 18 09:24:16 crc kubenswrapper[4778]: I0318 09:24:16.155289 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 09:24:21 crc kubenswrapper[4778]: I0318 09:24:21.381528 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 09:24:22 crc kubenswrapper[4778]: I0318 09:24:22.783056 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-99c8bfc86-rldfg" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Mar 18 09:24:22 crc kubenswrapper[4778]: I0318 09:24:22.783151 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:24:24 crc kubenswrapper[4778]: I0318 09:24:24.298770 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 09:24:25 crc kubenswrapper[4778]: I0318 09:24:25.677886 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:24:25 crc kubenswrapper[4778]: I0318 09:24:25.762475 4778 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/neutron-cffc84f44-vtx7x"] Mar 18 09:24:25 crc kubenswrapper[4778]: I0318 09:24:25.762700 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-cffc84f44-vtx7x" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-api" containerID="cri-o://5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827" gracePeriod=30 Mar 18 09:24:25 crc kubenswrapper[4778]: I0318 09:24:25.763108 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-cffc84f44-vtx7x" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-httpd" containerID="cri-o://6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3" gracePeriod=30 Mar 18 09:24:26 crc kubenswrapper[4778]: I0318 09:24:26.035870 4778 generic.go:334] "Generic (PLEG): container finished" podID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerID="6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3" exitCode=0 Mar 18 09:24:26 crc kubenswrapper[4778]: I0318 09:24:26.035924 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cffc84f44-vtx7x" event={"ID":"3d1d399f-3c89-4aaa-bba1-ce1a91358455","Type":"ContainerDied","Data":"6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3"} Mar 18 09:24:26 crc kubenswrapper[4778]: I0318 09:24:26.037481 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fec302c3-e5fc-4019-b4f5-50de6bdde59f","Type":"ContainerStarted","Data":"f1cac91237d6d6c46aeb8af41803efc6b4b3f1b77c65947a4636db5680599973"} Mar 18 09:24:26 crc kubenswrapper[4778]: I0318 09:24:26.060695 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.9306582479999999 podStartE2EDuration="12.060679382s" podCreationTimestamp="2026-03-18 09:24:14 +0000 UTC" firstStartedPulling="2026-03-18 09:24:15.141494592 +0000 UTC 
m=+1321.716239452" lastFinishedPulling="2026-03-18 09:24:25.271515746 +0000 UTC m=+1331.846260586" observedRunningTime="2026-03-18 09:24:26.053251609 +0000 UTC m=+1332.627996459" watchObservedRunningTime="2026-03-18 09:24:26.060679382 +0000 UTC m=+1332.635424222" Mar 18 09:24:26 crc kubenswrapper[4778]: I0318 09:24:26.966985 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.053746 4778 generic.go:334] "Generic (PLEG): container finished" podID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerID="e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9" exitCode=137 Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.053818 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.053856 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerDied","Data":"e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9"} Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.053922 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerDied","Data":"08fbe88aeb204ddd782e3073f280061d837a707d2c10f9b95b4eb6828823ed41"} Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.053948 4778 scope.go:117] "RemoveContainer" containerID="e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.105515 4778 scope.go:117] "RemoveContainer" containerID="19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.116469 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-config-data\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.116658 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-scripts\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.116734 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-run-httpd\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.117289 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-combined-ca-bundle\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.117367 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t77rr\" (UniqueName: \"kubernetes.io/projected/d609db91-e011-4ac4-91a6-a9ba51f3918e-kube-api-access-t77rr\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.117418 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-sg-core-conf-yaml\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.117456 
4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.117468 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-log-httpd\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.118282 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.118326 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.124466 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-scripts" (OuterVolumeSpecName: "scripts") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.129932 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d609db91-e011-4ac4-91a6-a9ba51f3918e-kube-api-access-t77rr" (OuterVolumeSpecName: "kube-api-access-t77rr") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "kube-api-access-t77rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.130286 4778 scope.go:117] "RemoveContainer" containerID="d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.152916 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.213775 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.220510 4778 scope.go:117] "RemoveContainer" containerID="7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.220850 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.220881 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.220894 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t77rr\" (UniqueName: \"kubernetes.io/projected/d609db91-e011-4ac4-91a6-a9ba51f3918e-kube-api-access-t77rr\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.220904 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.220916 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.241727 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-config-data" (OuterVolumeSpecName: "config-data") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.260688 4778 scope.go:117] "RemoveContainer" containerID="e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9" Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.261217 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9\": container with ID starting with e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9 not found: ID does not exist" containerID="e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.261313 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9"} err="failed to get container status \"e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9\": rpc error: code = NotFound desc = could not find container \"e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9\": container with ID starting with e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9 not found: ID does not exist" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.261404 4778 scope.go:117] "RemoveContainer" containerID="19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4" Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.262022 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4\": container with ID starting with 19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4 not found: ID does not exist" containerID="19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.262087 
4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4"} err="failed to get container status \"19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4\": rpc error: code = NotFound desc = could not find container \"19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4\": container with ID starting with 19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4 not found: ID does not exist" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.262123 4778 scope.go:117] "RemoveContainer" containerID="d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.262322 4778 scope.go:117] "RemoveContainer" containerID="392df7bab826882632f84664710c42da5399ea661a1dbfcd15aec0ad5d248553" Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.262590 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053\": container with ID starting with d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053 not found: ID does not exist" containerID="d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.262646 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053"} err="failed to get container status \"d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053\": rpc error: code = NotFound desc = could not find container \"d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053\": container with ID starting with d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053 not found: ID does not exist" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 
09:24:27.262677 4778 scope.go:117] "RemoveContainer" containerID="7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372" Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.263107 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372\": container with ID starting with 7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372 not found: ID does not exist" containerID="7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.263189 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372"} err="failed to get container status \"7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372\": rpc error: code = NotFound desc = could not find container \"7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372\": container with ID starting with 7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372 not found: ID does not exist" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.322849 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.405889 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.416837 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.433358 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.434111 4778 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-notification-agent" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.434234 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-notification-agent" Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.434332 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="proxy-httpd" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.434410 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="proxy-httpd" Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.434487 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-central-agent" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.434536 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-central-agent" Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.434601 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="sg-core" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.434725 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="sg-core" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.434977 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-central-agent" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.435038 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="sg-core" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.435097 4778 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="proxy-httpd" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.435161 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-notification-agent" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.436891 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.439943 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.440833 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.450326 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.629430 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-scripts\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.629712 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-config-data\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.629850 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-run-httpd\") pod 
\"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.630040 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.630170 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66lc2\" (UniqueName: \"kubernetes.io/projected/d1285385-097b-434c-8e95-dc27069185e1-kube-api-access-66lc2\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.630285 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-log-httpd\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.630399 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.732429 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc 
kubenswrapper[4778]: I0318 09:24:27.732722 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66lc2\" (UniqueName: \"kubernetes.io/projected/d1285385-097b-434c-8e95-dc27069185e1-kube-api-access-66lc2\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.732826 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-log-httpd\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.732895 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.733002 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-scripts\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.733072 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-config-data\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.733153 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-run-httpd\") pod 
\"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.733221 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-log-httpd\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.733503 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-run-httpd\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.736504 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.736813 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.736881 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-scripts\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.738000 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-config-data\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.752217 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66lc2\" (UniqueName: \"kubernetes.io/projected/d1285385-097b-434c-8e95-dc27069185e1-kube-api-access-66lc2\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.773240 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:28 crc kubenswrapper[4778]: I0318 09:24:28.196111 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" path="/var/lib/kubelet/pods/d609db91-e011-4ac4-91a6-a9ba51f3918e/volumes" Mar 18 09:24:28 crc kubenswrapper[4778]: I0318 09:24:28.238963 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:28 crc kubenswrapper[4778]: I0318 09:24:28.798509 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.094937 4778 generic.go:334] "Generic (PLEG): container finished" podID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerID="d66d99949ad5120620bffa75ea4a3e49aee89967a841972aa06a4a9867cff673" exitCode=137 Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.095139 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-99c8bfc86-rldfg" event={"ID":"ea1f8a48-d595-4f4e-a740-0af5a26397a5","Type":"ContainerDied","Data":"d66d99949ad5120620bffa75ea4a3e49aee89967a841972aa06a4a9867cff673"} Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.115661 4778 generic.go:334] "Generic (PLEG): container finished" podID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" 
containerID="c901fa502e83a3c43053baeb3fceb6b85a02d87dc36f812db5898d2cee7e9935" exitCode=137 Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.115722 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ef4b958-769d-43c2-91d4-3a6cb76d3851","Type":"ContainerDied","Data":"c901fa502e83a3c43053baeb3fceb6b85a02d87dc36f812db5898d2cee7e9935"} Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.118909 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerStarted","Data":"f276f5ce69a40ba1466b6ac8bed74eb71e6d22c683684e8887ac3793fd97f1ea"} Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.233462 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.362906 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef4b958-769d-43c2-91d4-3a6cb76d3851-logs\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363318 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363359 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data-custom\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363416 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-scripts\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363438 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ef4b958-769d-43c2-91d4-3a6cb76d3851-etc-machine-id\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363469 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-combined-ca-bundle\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363499 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vxxv\" (UniqueName: \"kubernetes.io/projected/7ef4b958-769d-43c2-91d4-3a6cb76d3851-kube-api-access-7vxxv\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363771 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef4b958-769d-43c2-91d4-3a6cb76d3851-logs" (OuterVolumeSpecName: "logs") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.365379 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ef4b958-769d-43c2-91d4-3a6cb76d3851-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.373095 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.373442 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef4b958-769d-43c2-91d4-3a6cb76d3851-kube-api-access-7vxxv" (OuterVolumeSpecName: "kube-api-access-7vxxv") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "kube-api-access-7vxxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.379235 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-scripts" (OuterVolumeSpecName: "scripts") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.402812 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.437118 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data" (OuterVolumeSpecName: "config-data") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.467064 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef4b958-769d-43c2-91d4-3a6cb76d3851-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.467090 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.467103 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.467114 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: 
I0318 09:24:29.467125 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ef4b958-769d-43c2-91d4-3a6cb76d3851-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.467134 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.467143 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vxxv\" (UniqueName: \"kubernetes.io/projected/7ef4b958-769d-43c2-91d4-3a6cb76d3851-kube-api-access-7vxxv\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.811493 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994038 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpxvx\" (UniqueName: \"kubernetes.io/projected/ea1f8a48-d595-4f4e-a740-0af5a26397a5-kube-api-access-wpxvx\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994216 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1f8a48-d595-4f4e-a740-0af5a26397a5-logs\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994286 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-combined-ca-bundle\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: 
\"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994387 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-scripts\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994412 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-config-data\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994469 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-tls-certs\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994497 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-secret-key\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994966 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea1f8a48-d595-4f4e-a740-0af5a26397a5-logs" (OuterVolumeSpecName: "logs") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.000415 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.002791 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea1f8a48-d595-4f4e-a740-0af5a26397a5-kube-api-access-wpxvx" (OuterVolumeSpecName: "kube-api-access-wpxvx") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "kube-api-access-wpxvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.020794 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-scripts" (OuterVolumeSpecName: "scripts") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.077956 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-config-data" (OuterVolumeSpecName: "config-data") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.080702 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.098830 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.098876 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.098891 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.098907 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpxvx\" (UniqueName: \"kubernetes.io/projected/ea1f8a48-d595-4f4e-a740-0af5a26397a5-kube-api-access-wpxvx\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.098919 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1f8a48-d595-4f4e-a740-0af5a26397a5-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.098928 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.124543 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.158189 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerStarted","Data":"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"} Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.168278 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-99c8bfc86-rldfg" event={"ID":"ea1f8a48-d595-4f4e-a740-0af5a26397a5","Type":"ContainerDied","Data":"c77fd4278a90c239273c01a79ef12824477ebf4a1fc89be85a96364b2e982560"} Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.168335 4778 scope.go:117] "RemoveContainer" containerID="c97b0574c01538fa1c2084f0fe2463a0e870b56db44f51730acba68dbd0ebf25" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.168433 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.177866 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ef4b958-769d-43c2-91d4-3a6cb76d3851","Type":"ContainerDied","Data":"158c8e2542c97971367812456984ca4e3f98182f67d2f9c5b6c2354ec14b4a85"} Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.177974 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.200422 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.274030 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-99c8bfc86-rldfg"] Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.305880 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-99c8bfc86-rldfg"] Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.325676 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.339413 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.349331 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:24:30 crc kubenswrapper[4778]: E0318 09:24:30.349873 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api-log" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.349889 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api-log" Mar 18 09:24:30 crc kubenswrapper[4778]: E0318 09:24:30.349914 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.349921 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" Mar 18 09:24:30 crc kubenswrapper[4778]: E0318 09:24:30.349930 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon-log" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.349937 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon-log" Mar 18 09:24:30 crc kubenswrapper[4778]: E0318 09:24:30.349950 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.349956 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.350165 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.350178 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon-log" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.350213 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api-log" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.350227 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.351322 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.356415 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.356567 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.357218 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.363226 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.388529 4778 scope.go:117] "RemoveContainer" containerID="d66d99949ad5120620bffa75ea4a3e49aee89967a841972aa06a4a9867cff673"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.410986 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-etc-machine-id\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-config-data\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411122 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-logs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411154 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-config-data-custom\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411206 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wmzv\" (UniqueName: \"kubernetes.io/projected/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-kube-api-access-7wmzv\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411241 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411268 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-scripts\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411287 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-public-tls-certs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411308 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.430321 4778 scope.go:117] "RemoveContainer" containerID="c901fa502e83a3c43053baeb3fceb6b85a02d87dc36f812db5898d2cee7e9935"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.463356 4778 scope.go:117] "RemoveContainer" containerID="825b48041531c5b0c01f329609e8aff59582830809964a4de416fba30a539f48"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.512309 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-config-data-custom\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.512394 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wmzv\" (UniqueName: \"kubernetes.io/projected/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-kube-api-access-7wmzv\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.512439 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.512473 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-scripts\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.512495 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-public-tls-certs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.512520 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.513240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-etc-machine-id\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.513268 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-config-data\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.513431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-etc-machine-id\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.514082 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-logs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.514388 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-logs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.517010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-public-tls-certs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.517431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.517599 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-scripts\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.519568 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.519619 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-config-data-custom\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.522243 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-config-data\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.534877 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wmzv\" (UniqueName: \"kubernetes.io/projected/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-kube-api-access-7wmzv\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0"
Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.677893 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.128568 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cffc84f44-vtx7x"
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.204097 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerStarted","Data":"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"}
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.223157 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.225841 4778 generic.go:334] "Generic (PLEG): container finished" podID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerID="5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827" exitCode=0
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.225882 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cffc84f44-vtx7x" event={"ID":"3d1d399f-3c89-4aaa-bba1-ce1a91358455","Type":"ContainerDied","Data":"5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827"}
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.225911 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cffc84f44-vtx7x" event={"ID":"3d1d399f-3c89-4aaa-bba1-ce1a91358455","Type":"ContainerDied","Data":"e444d25da091179b3622d7408ec0b6e7caa7c81b27414dca1d4252c8b3fb5441"}
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.225928 4778 scope.go:117] "RemoveContainer" containerID="6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3"
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.226050 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cffc84f44-vtx7x"
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.231863 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-config\") pod \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") "
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.231917 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-httpd-config\") pod \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") "
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.231976 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnk27\" (UniqueName: \"kubernetes.io/projected/3d1d399f-3c89-4aaa-bba1-ce1a91358455-kube-api-access-qnk27\") pod \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") "
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.232003 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-combined-ca-bundle\") pod \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") "
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.232149 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-ovndb-tls-certs\") pod \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") "
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.238319 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3d1d399f-3c89-4aaa-bba1-ce1a91358455" (UID: "3d1d399f-3c89-4aaa-bba1-ce1a91358455"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.245358 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1d399f-3c89-4aaa-bba1-ce1a91358455-kube-api-access-qnk27" (OuterVolumeSpecName: "kube-api-access-qnk27") pod "3d1d399f-3c89-4aaa-bba1-ce1a91358455" (UID: "3d1d399f-3c89-4aaa-bba1-ce1a91358455"). InnerVolumeSpecName "kube-api-access-qnk27". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.299987 4778 scope.go:117] "RemoveContainer" containerID="5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827"
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.334508 4778 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.334568 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnk27\" (UniqueName: \"kubernetes.io/projected/3d1d399f-3c89-4aaa-bba1-ce1a91358455-kube-api-access-qnk27\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.338046 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-config" (OuterVolumeSpecName: "config") pod "3d1d399f-3c89-4aaa-bba1-ce1a91358455" (UID: "3d1d399f-3c89-4aaa-bba1-ce1a91358455"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.363850 4778 scope.go:117] "RemoveContainer" containerID="6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3"
Mar 18 09:24:31 crc kubenswrapper[4778]: E0318 09:24:31.364886 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3\": container with ID starting with 6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3 not found: ID does not exist" containerID="6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3"
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.364930 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3"} err="failed to get container status \"6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3\": rpc error: code = NotFound desc = could not find container \"6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3\": container with ID starting with 6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3 not found: ID does not exist"
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.364962 4778 scope.go:117] "RemoveContainer" containerID="5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827"
Mar 18 09:24:31 crc kubenswrapper[4778]: E0318 09:24:31.365406 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827\": container with ID starting with 5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827 not found: ID does not exist" containerID="5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827"
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.365440 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827"} err="failed to get container status \"5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827\": rpc error: code = NotFound desc = could not find container \"5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827\": container with ID starting with 5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827 not found: ID does not exist"
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.369555 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d1d399f-3c89-4aaa-bba1-ce1a91358455" (UID: "3d1d399f-3c89-4aaa-bba1-ce1a91358455"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.389387 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3d1d399f-3c89-4aaa-bba1-ce1a91358455" (UID: "3d1d399f-3c89-4aaa-bba1-ce1a91358455"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.436466 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-config\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.436516 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.436531 4778 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.562446 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cffc84f44-vtx7x"]
Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.570997 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-cffc84f44-vtx7x"]
Mar 18 09:24:32 crc kubenswrapper[4778]: I0318 09:24:32.198239 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" path="/var/lib/kubelet/pods/3d1d399f-3c89-4aaa-bba1-ce1a91358455/volumes"
Mar 18 09:24:32 crc kubenswrapper[4778]: I0318 09:24:32.199326 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" path="/var/lib/kubelet/pods/7ef4b958-769d-43c2-91d4-3a6cb76d3851/volumes"
Mar 18 09:24:32 crc kubenswrapper[4778]: I0318 09:24:32.199987 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" path="/var/lib/kubelet/pods/ea1f8a48-d595-4f4e-a740-0af5a26397a5/volumes"
Mar 18 09:24:32 crc kubenswrapper[4778]: I0318 09:24:32.237982 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerStarted","Data":"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"}
Mar 18 09:24:32 crc kubenswrapper[4778]: I0318 09:24:32.241720 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51","Type":"ContainerStarted","Data":"f62b79dc1c13ddbdcdf1ff9979e4795dab48a8b05a7a40c92da85226bd3dfb12"}
Mar 18 09:24:32 crc kubenswrapper[4778]: I0318 09:24:32.241764 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51","Type":"ContainerStarted","Data":"7ad59e74d5727c6448c429922f9615f73c95156d0fab49db22041f221c72e6d7"}
Mar 18 09:24:33 crc kubenswrapper[4778]: I0318 09:24:33.265620 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51","Type":"ContainerStarted","Data":"cf14eeba4375e3a27d3011a3de7d5dc3a5ef6779728ba976290726eeff2d4ccd"}
Mar 18 09:24:33 crc kubenswrapper[4778]: I0318 09:24:33.266386 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.233693 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.233625846 podStartE2EDuration="4.233625846s" podCreationTimestamp="2026-03-18 09:24:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:24:33.295937015 +0000 UTC m=+1339.870682055" watchObservedRunningTime="2026-03-18 09:24:34.233625846 +0000 UTC m=+1340.808370696"
Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.296723 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-central-agent" containerID="cri-o://1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e" gracePeriod=30
Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.297115 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerStarted","Data":"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"}
Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.297185 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.297645 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="proxy-httpd" containerID="cri-o://1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64" gracePeriod=30
Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.297708 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="sg-core" containerID="cri-o://d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8" gracePeriod=30
Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.297764 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-notification-agent" containerID="cri-o://98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46" gracePeriod=30
Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.331037 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.42741177 podStartE2EDuration="7.331007079s" podCreationTimestamp="2026-03-18 09:24:27 +0000 UTC" firstStartedPulling="2026-03-18 09:24:28.248808565 +0000 UTC m=+1334.823553405" lastFinishedPulling="2026-03-18 09:24:33.152403874 +0000 UTC m=+1339.727148714" observedRunningTime="2026-03-18 09:24:34.322263801 +0000 UTC m=+1340.897008661" watchObservedRunningTime="2026-03-18 09:24:34.331007079 +0000 UTC m=+1340.905751959"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.147670 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307747 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1285385-097b-434c-8e95-dc27069185e1" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64" exitCode=0
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307777 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1285385-097b-434c-8e95-dc27069185e1" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8" exitCode=2
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307788 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1285385-097b-434c-8e95-dc27069185e1" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46" exitCode=0
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307796 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1285385-097b-434c-8e95-dc27069185e1" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e" exitCode=0
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307790 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerDied","Data":"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"}
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307839 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerDied","Data":"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"}
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerDied","Data":"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"}
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307846 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307871 4778 scope.go:117] "RemoveContainer" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307860 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerDied","Data":"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"}
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307987 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerDied","Data":"f276f5ce69a40ba1466b6ac8bed74eb71e6d22c683684e8887ac3793fd97f1ea"}
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.311920 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66lc2\" (UniqueName: \"kubernetes.io/projected/d1285385-097b-434c-8e95-dc27069185e1-kube-api-access-66lc2\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") "
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312013 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-run-httpd\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") "
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312045 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-sg-core-conf-yaml\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") "
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312155 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-config-data\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") "
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312256 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-scripts\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") "
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312289 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-log-httpd\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") "
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312322 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-combined-ca-bundle\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") "
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312616 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312648 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.313763 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.313947 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.316967 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1285385-097b-434c-8e95-dc27069185e1-kube-api-access-66lc2" (OuterVolumeSpecName: "kube-api-access-66lc2") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "kube-api-access-66lc2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.317359 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-scripts" (OuterVolumeSpecName: "scripts") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.328886 4778 scope.go:117] "RemoveContainer" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.336940 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.349365 4778 scope.go:117] "RemoveContainer" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.379018 4778 scope.go:117] "RemoveContainer" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.387631 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.408955 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-config-data" (OuterVolumeSpecName: "config-data") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.411465 4778 scope.go:117] "RemoveContainer" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.412150 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": container with ID starting with 1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64 not found: ID does not exist" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.412273 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"} err="failed to get container status \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": rpc error: code = NotFound desc = could not find container \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": container with ID starting with 1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.412318 4778 scope.go:117] "RemoveContainer" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.412844 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": container with ID starting with d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8 not found: ID does not exist" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.412884 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"} err="failed to get container status \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": rpc error: code = NotFound desc = could not find container \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": container with ID starting with d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.412906 4778 scope.go:117] "RemoveContainer" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.413400 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": container with ID starting with 98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46 not found: ID does not exist" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.413447 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"} err="failed to get container status \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": rpc error: code = NotFound desc = could not find container \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": container with ID starting with 98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.413471 4778 scope.go:117] "RemoveContainer" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.413904 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": container with ID starting with 1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e not found: ID does not exist" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.413934 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"} err="failed to get container status \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": rpc error: code = NotFound desc = could not find container \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": container with ID starting with 1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.413954 4778 scope.go:117] "RemoveContainer" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.414497 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"} err="failed to get container status \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": rpc error: code = NotFound desc = could not find
container \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": container with ID starting with 1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64 not found: ID does not exist" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.414549 4778 scope.go:117] "RemoveContainer" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.414895 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"} err="failed to get container status \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": rpc error: code = NotFound desc = could not find container \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": container with ID starting with d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8 not found: ID does not exist" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.414919 4778 scope.go:117] "RemoveContainer" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.415287 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"} err="failed to get container status \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": rpc error: code = NotFound desc = could not find container \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": container with ID starting with 98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46 not found: ID does not exist" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.415391 4778 scope.go:117] "RemoveContainer" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.415818 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"} err="failed to get container status \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": rpc error: code = NotFound desc = could not find container \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": container with ID starting with 1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e not found: ID does not exist" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.415841 4778 scope.go:117] "RemoveContainer" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.415855 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.415887 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66lc2\" (UniqueName: \"kubernetes.io/projected/d1285385-097b-434c-8e95-dc27069185e1-kube-api-access-66lc2\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416107 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416123 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416135 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416316 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"} err="failed to get container status \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": rpc error: code = NotFound desc = could not find container \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": container with ID starting with 1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64 not found: ID does not exist" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416336 4778 scope.go:117] "RemoveContainer" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416679 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"} err="failed to get container status \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": rpc error: code = NotFound desc = could not find container \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": container with ID starting with d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8 not found: ID does not exist" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416698 4778 scope.go:117] "RemoveContainer" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417019 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"} err="failed to get container status \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": rpc error: 
code = NotFound desc = could not find container \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": container with ID starting with 98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46 not found: ID does not exist" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417038 4778 scope.go:117] "RemoveContainer" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417333 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"} err="failed to get container status \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": rpc error: code = NotFound desc = could not find container \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": container with ID starting with 1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e not found: ID does not exist" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417351 4778 scope.go:117] "RemoveContainer" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417655 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"} err="failed to get container status \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": rpc error: code = NotFound desc = could not find container \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": container with ID starting with 1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64 not found: ID does not exist" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417675 4778 scope.go:117] "RemoveContainer" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8" Mar 18 09:24:35 crc 
kubenswrapper[4778]: I0318 09:24:35.417981 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"} err="failed to get container status \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": rpc error: code = NotFound desc = could not find container \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": container with ID starting with d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8 not found: ID does not exist" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.418000 4778 scope.go:117] "RemoveContainer" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.418369 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"} err="failed to get container status \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": rpc error: code = NotFound desc = could not find container \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": container with ID starting with 98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46 not found: ID does not exist" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.418414 4778 scope.go:117] "RemoveContainer" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.418756 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"} err="failed to get container status \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": rpc error: code = NotFound desc = could not find container \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": container 
with ID starting with 1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e not found: ID does not exist" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.640064 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.649269 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678309 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.678753 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="sg-core" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678774 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="sg-core" Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.678804 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-notification-agent" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678812 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-notification-agent" Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.678830 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="proxy-httpd" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678838 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="proxy-httpd" Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.678853 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-central-agent" Mar 18 09:24:35 crc 
kubenswrapper[4778]: I0318 09:24:35.678861 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-central-agent" Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.678878 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-httpd" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678886 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-httpd" Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.678909 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-api" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678917 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-api" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.679113 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-notification-agent" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.679130 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-api" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.679145 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="sg-core" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.679160 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-httpd" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.679174 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-central-agent" Mar 18 
09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.679191 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="proxy-httpd" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.681702 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.684802 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.688354 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.702854 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824073 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-config-data\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824234 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-scripts\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824268 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 
09:24:35.824301 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-log-httpd\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824325 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-run-httpd\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824350 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824376 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bstbn\" (UniqueName: \"kubernetes.io/projected/7bbbb56d-564a-45df-b50c-7bc4ba290812-kube-api-access-bstbn\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.926533 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-config-data\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.926703 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-scripts\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.926764 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.927376 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-log-httpd\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.927410 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-run-httpd\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.927429 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.927444 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bstbn\" (UniqueName: \"kubernetes.io/projected/7bbbb56d-564a-45df-b50c-7bc4ba290812-kube-api-access-bstbn\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: 
I0318 09:24:35.927713 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-run-httpd\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.927877 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-log-httpd\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.931101 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.932905 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.935021 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-scripts\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.947349 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-config-data\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " 
pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.952488 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bstbn\" (UniqueName: \"kubernetes.io/projected/7bbbb56d-564a-45df-b50c-7bc4ba290812-kube-api-access-bstbn\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0" Mar 18 09:24:36 crc kubenswrapper[4778]: I0318 09:24:36.025729 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:36 crc kubenswrapper[4778]: I0318 09:24:36.212706 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1285385-097b-434c-8e95-dc27069185e1" path="/var/lib/kubelet/pods/d1285385-097b-434c-8e95-dc27069185e1/volumes" Mar 18 09:24:36 crc kubenswrapper[4778]: I0318 09:24:36.486097 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:37 crc kubenswrapper[4778]: I0318 09:24:37.337857 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerStarted","Data":"4c0e3af08cf9b31b7b8c65fe63263681dd011db44c530c1549fcf2d8b00c430f"} Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.115469 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-nl2dg"] Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.116820 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-nl2dg" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.123857 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-nl2dg"] Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.221597 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9hlqk"] Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.223172 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9hlqk" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.237067 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9hlqk"] Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.273099 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjp78\" (UniqueName: \"kubernetes.io/projected/8341ceba-13e0-410f-a7d2-23190a07d914-kube-api-access-xjp78\") pod \"nova-api-db-create-nl2dg\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " pod="openstack/nova-api-db-create-nl2dg" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.273379 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8341ceba-13e0-410f-a7d2-23190a07d914-operator-scripts\") pod \"nova-api-db-create-nl2dg\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " pod="openstack/nova-api-db-create-nl2dg" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.303916 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-t5x58"] Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.304938 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-t5x58"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.314382 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t5x58"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.326242 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-922c-account-create-update-6z2xf"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.327388 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-922c-account-create-update-6z2xf"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.329040 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.336576 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-922c-account-create-update-6z2xf"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.376422 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjp78\" (UniqueName: \"kubernetes.io/projected/8341ceba-13e0-410f-a7d2-23190a07d914-kube-api-access-xjp78\") pod \"nova-api-db-create-nl2dg\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.376556 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8341ceba-13e0-410f-a7d2-23190a07d914-operator-scripts\") pod \"nova-api-db-create-nl2dg\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.376625 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ccbt\" (UniqueName: \"kubernetes.io/projected/92444732-2d3e-4065-a336-74b37b711530-kube-api-access-5ccbt\") pod \"nova-cell0-db-create-9hlqk\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.376660 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92444732-2d3e-4065-a336-74b37b711530-operator-scripts\") pod \"nova-cell0-db-create-9hlqk\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.379932 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8341ceba-13e0-410f-a7d2-23190a07d914-operator-scripts\") pod \"nova-api-db-create-nl2dg\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.402097 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjp78\" (UniqueName: \"kubernetes.io/projected/8341ceba-13e0-410f-a7d2-23190a07d914-kube-api-access-xjp78\") pod \"nova-api-db-create-nl2dg\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.441070 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.478735 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-operator-scripts\") pod \"nova-api-922c-account-create-update-6z2xf\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " pod="openstack/nova-api-922c-account-create-update-6z2xf"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.478823 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b380dfb3-b55b-4db2-bd8f-a90b4470345d-operator-scripts\") pod \"nova-cell1-db-create-t5x58\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " pod="openstack/nova-cell1-db-create-t5x58"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.478859 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-kube-api-access-4xcpw\") pod \"nova-api-922c-account-create-update-6z2xf\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " pod="openstack/nova-api-922c-account-create-update-6z2xf"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.478904 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t86pm\" (UniqueName: \"kubernetes.io/projected/b380dfb3-b55b-4db2-bd8f-a90b4470345d-kube-api-access-t86pm\") pod \"nova-cell1-db-create-t5x58\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " pod="openstack/nova-cell1-db-create-t5x58"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.478933 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ccbt\" (UniqueName: \"kubernetes.io/projected/92444732-2d3e-4065-a336-74b37b711530-kube-api-access-5ccbt\") pod \"nova-cell0-db-create-9hlqk\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.478961 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92444732-2d3e-4065-a336-74b37b711530-operator-scripts\") pod \"nova-cell0-db-create-9hlqk\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.479832 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92444732-2d3e-4065-a336-74b37b711530-operator-scripts\") pod \"nova-cell0-db-create-9hlqk\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.498256 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ccbt\" (UniqueName: \"kubernetes.io/projected/92444732-2d3e-4065-a336-74b37b711530-kube-api-access-5ccbt\") pod \"nova-cell0-db-create-9hlqk\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.519639 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-722f-account-create-update-slwd5"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.520644 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-722f-account-create-update-slwd5"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.522757 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.539330 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.546145 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-722f-account-create-update-slwd5"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.601270 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-operator-scripts\") pod \"nova-api-922c-account-create-update-6z2xf\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " pod="openstack/nova-api-922c-account-create-update-6z2xf"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.601525 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b380dfb3-b55b-4db2-bd8f-a90b4470345d-operator-scripts\") pod \"nova-cell1-db-create-t5x58\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " pod="openstack/nova-cell1-db-create-t5x58"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.601572 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-kube-api-access-4xcpw\") pod \"nova-api-922c-account-create-update-6z2xf\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " pod="openstack/nova-api-922c-account-create-update-6z2xf"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.601640 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t86pm\" (UniqueName: \"kubernetes.io/projected/b380dfb3-b55b-4db2-bd8f-a90b4470345d-kube-api-access-t86pm\") pod \"nova-cell1-db-create-t5x58\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " pod="openstack/nova-cell1-db-create-t5x58"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.604058 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b380dfb3-b55b-4db2-bd8f-a90b4470345d-operator-scripts\") pod \"nova-cell1-db-create-t5x58\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " pod="openstack/nova-cell1-db-create-t5x58"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.633932 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-operator-scripts\") pod \"nova-api-922c-account-create-update-6z2xf\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " pod="openstack/nova-api-922c-account-create-update-6z2xf"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.637572 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-kube-api-access-4xcpw\") pod \"nova-api-922c-account-create-update-6z2xf\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " pod="openstack/nova-api-922c-account-create-update-6z2xf"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.642572 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t86pm\" (UniqueName: \"kubernetes.io/projected/b380dfb3-b55b-4db2-bd8f-a90b4470345d-kube-api-access-t86pm\") pod \"nova-cell1-db-create-t5x58\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " pod="openstack/nova-cell1-db-create-t5x58"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.649773 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-922c-account-create-update-6z2xf"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.671515 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b9f9-account-create-update-w7q2n"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.672943 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.676328 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.692730 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b9f9-account-create-update-w7q2n"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.704084 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f06b776-36bc-45ba-88d4-69608f9665e6-operator-scripts\") pod \"nova-cell0-722f-account-create-update-slwd5\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " pod="openstack/nova-cell0-722f-account-create-update-slwd5"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.704177 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7plt5\" (UniqueName: \"kubernetes.io/projected/2f06b776-36bc-45ba-88d4-69608f9665e6-kube-api-access-7plt5\") pod \"nova-cell0-722f-account-create-update-slwd5\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " pod="openstack/nova-cell0-722f-account-create-update-slwd5"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.805366 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f06b776-36bc-45ba-88d4-69608f9665e6-operator-scripts\") pod \"nova-cell0-722f-account-create-update-slwd5\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " pod="openstack/nova-cell0-722f-account-create-update-slwd5"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.805449 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r58f4\" (UniqueName: \"kubernetes.io/projected/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-kube-api-access-r58f4\") pod \"nova-cell1-b9f9-account-create-update-w7q2n\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.805478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7plt5\" (UniqueName: \"kubernetes.io/projected/2f06b776-36bc-45ba-88d4-69608f9665e6-kube-api-access-7plt5\") pod \"nova-cell0-722f-account-create-update-slwd5\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " pod="openstack/nova-cell0-722f-account-create-update-slwd5"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.805540 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-operator-scripts\") pod \"nova-cell1-b9f9-account-create-update-w7q2n\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.807511 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f06b776-36bc-45ba-88d4-69608f9665e6-operator-scripts\") pod \"nova-cell0-722f-account-create-update-slwd5\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " pod="openstack/nova-cell0-722f-account-create-update-slwd5"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.833875 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7plt5\" (UniqueName: \"kubernetes.io/projected/2f06b776-36bc-45ba-88d4-69608f9665e6-kube-api-access-7plt5\") pod \"nova-cell0-722f-account-create-update-slwd5\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " pod="openstack/nova-cell0-722f-account-create-update-slwd5"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.906680 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r58f4\" (UniqueName: \"kubernetes.io/projected/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-kube-api-access-r58f4\") pod \"nova-cell1-b9f9-account-create-update-w7q2n\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.906773 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-operator-scripts\") pod \"nova-cell1-b9f9-account-create-update-w7q2n\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.908735 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-operator-scripts\") pod \"nova-cell1-b9f9-account-create-update-w7q2n\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.920681 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t5x58"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.927609 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r58f4\" (UniqueName: \"kubernetes.io/projected/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-kube-api-access-r58f4\") pod \"nova-cell1-b9f9-account-create-update-w7q2n\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n"
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.105478 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-722f-account-create-update-slwd5"
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.105968 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-nl2dg"]
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.117010 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n"
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.212139 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9hlqk"]
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.224927 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-922c-account-create-update-6z2xf"]
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.393368 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9hlqk" event={"ID":"92444732-2d3e-4065-a336-74b37b711530","Type":"ContainerStarted","Data":"3333452c0f9c0aa6c9b681d6899b93de37fd7ca4bbeb37555194646d5e14f4aa"}
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.409529 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-922c-account-create-update-6z2xf" event={"ID":"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e","Type":"ContainerStarted","Data":"f6194e96f072cfd53c406f39a7550f29db5d67bdc6367c173d9028ecd67df647"}
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.434520 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerStarted","Data":"2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59"}
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.438345 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t5x58"]
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.442331 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nl2dg" event={"ID":"8341ceba-13e0-410f-a7d2-23190a07d914","Type":"ContainerStarted","Data":"d2ab977039aaae5c8f1427bcc52b3dcaad0308d3866ed320458f83ac1d2b6005"}
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.691315 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-nl2dg" podStartSLOduration=1.6912805789999998 podStartE2EDuration="1.691280579s" podCreationTimestamp="2026-03-18 09:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:24:39.466791263 +0000 UTC m=+1346.041536113" watchObservedRunningTime="2026-03-18 09:24:39.691280579 +0000 UTC m=+1346.266025419"
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.699143 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b9f9-account-create-update-w7q2n"]
Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.797631 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-722f-account-create-update-slwd5"]
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.452731 4778 generic.go:334] "Generic (PLEG): container finished" podID="b380dfb3-b55b-4db2-bd8f-a90b4470345d" containerID="c118c28760c4816bb842a36e485ff938333b6ae9902cf9242267aa191e3d70bf" exitCode=0
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.453758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t5x58" event={"ID":"b380dfb3-b55b-4db2-bd8f-a90b4470345d","Type":"ContainerDied","Data":"c118c28760c4816bb842a36e485ff938333b6ae9902cf9242267aa191e3d70bf"}
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.453800 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t5x58" event={"ID":"b380dfb3-b55b-4db2-bd8f-a90b4470345d","Type":"ContainerStarted","Data":"2cf69c2fb75c3f078b4c9042bf5f15177ce33c7cf16cfe38dde0358c80b6c73f"}
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.469815 4778 generic.go:334] "Generic (PLEG): container finished" podID="92444732-2d3e-4065-a336-74b37b711530" containerID="7fb36f99fa48f9c60dbdcb8445fed2d769e9cb712ffc10c71b7ff46632229d69" exitCode=0
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.469895 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9hlqk" event={"ID":"92444732-2d3e-4065-a336-74b37b711530","Type":"ContainerDied","Data":"7fb36f99fa48f9c60dbdcb8445fed2d769e9cb712ffc10c71b7ff46632229d69"}
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.479806 4778 generic.go:334] "Generic (PLEG): container finished" podID="b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" containerID="3cc34e35f2db07df2220b6c334d24c112405b578f89727d873a592536bc78998" exitCode=0
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.480155 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-922c-account-create-update-6z2xf" event={"ID":"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e","Type":"ContainerDied","Data":"3cc34e35f2db07df2220b6c334d24c112405b578f89727d873a592536bc78998"}
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.489383 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerStarted","Data":"5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6"}
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.494748 4778 generic.go:334] "Generic (PLEG): container finished" podID="2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" containerID="47bfce503465075386d4ab81517eb08824a50d2ca76a4ab55639a7aea5948d36" exitCode=0
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.494845 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" event={"ID":"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d","Type":"ContainerDied","Data":"47bfce503465075386d4ab81517eb08824a50d2ca76a4ab55639a7aea5948d36"}
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.494874 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" event={"ID":"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d","Type":"ContainerStarted","Data":"50d29b828e5b4eae76fdf54a3bce80608ad76949a8ff5e9b561728be802b7516"}
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.510473 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nl2dg" event={"ID":"8341ceba-13e0-410f-a7d2-23190a07d914","Type":"ContainerDied","Data":"9866f0cece8384eb6d69125fd4f2648001a15f8207d97598a6f6b380c668253f"}
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.520691 4778 generic.go:334] "Generic (PLEG): container finished" podID="8341ceba-13e0-410f-a7d2-23190a07d914" containerID="9866f0cece8384eb6d69125fd4f2648001a15f8207d97598a6f6b380c668253f" exitCode=0
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.531045 4778 generic.go:334] "Generic (PLEG): container finished" podID="2f06b776-36bc-45ba-88d4-69608f9665e6" containerID="b0ad59dfbfbe8f98b2a7024fc11350f06ab712f37850bffb7121c440c9344960" exitCode=0
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.531104 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-722f-account-create-update-slwd5" event={"ID":"2f06b776-36bc-45ba-88d4-69608f9665e6","Type":"ContainerDied","Data":"b0ad59dfbfbe8f98b2a7024fc11350f06ab712f37850bffb7121c440c9344960"}
Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.531134 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-722f-account-create-update-slwd5" event={"ID":"2f06b776-36bc-45ba-88d4-69608f9665e6","Type":"ContainerStarted","Data":"7ee8c9776efd482ad7a36b5754dd4923536b7bbd44753fa87fbbf5f1cfa338b4"}
Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.130874 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7588d8786-t6x7l"
Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.215463 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7588d8786-t6x7l"
Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.295954 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7b57877776-ssjzt"]
Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.296357 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7b57877776-ssjzt" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-log" containerID="cri-o://8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7" gracePeriod=30
Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.296455 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7b57877776-ssjzt" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-api" containerID="cri-o://2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f" gracePeriod=30
Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.543921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerStarted","Data":"4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe"}
Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.547063 4778 generic.go:334] "Generic (PLEG): container finished" podID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerID="8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7" exitCode=143
Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.547329 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57877776-ssjzt" event={"ID":"9c5e9a8c-649d-4d20-b867-bb0f801d329d","Type":"ContainerDied","Data":"8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7"}
Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.966334 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-922c-account-create-update-6z2xf"
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.086351 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" (UID: "b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.089342 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-operator-scripts\") pod \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") "
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.089517 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-kube-api-access-4xcpw\") pod \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") "
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.091056 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.118213 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-kube-api-access-4xcpw" (OuterVolumeSpecName: "kube-api-access-4xcpw") pod "b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" (UID: "b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e"). InnerVolumeSpecName "kube-api-access-4xcpw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.193655 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-kube-api-access-4xcpw\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.229946 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.261524 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t5x58"
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.271186 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n"
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.285254 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.294319 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-722f-account-create-update-slwd5"
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.397624 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b380dfb3-b55b-4db2-bd8f-a90b4470345d-operator-scripts\") pod \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") "
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.397704 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t86pm\" (UniqueName: \"kubernetes.io/projected/b380dfb3-b55b-4db2-bd8f-a90b4470345d-kube-api-access-t86pm\") pod \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") "
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.397737 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ccbt\" (UniqueName: \"kubernetes.io/projected/92444732-2d3e-4065-a336-74b37b711530-kube-api-access-5ccbt\") pod \"92444732-2d3e-4065-a336-74b37b711530\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") "
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.397812 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7plt5\" (UniqueName: \"kubernetes.io/projected/2f06b776-36bc-45ba-88d4-69608f9665e6-kube-api-access-7plt5\") pod \"2f06b776-36bc-45ba-88d4-69608f9665e6\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") "
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.397838 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f06b776-36bc-45ba-88d4-69608f9665e6-operator-scripts\") pod \"2f06b776-36bc-45ba-88d4-69608f9665e6\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") "
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.397943 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjp78\" (UniqueName: \"kubernetes.io/projected/8341ceba-13e0-410f-a7d2-23190a07d914-kube-api-access-xjp78\") pod \"8341ceba-13e0-410f-a7d2-23190a07d914\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") "
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.398103 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8341ceba-13e0-410f-a7d2-23190a07d914-operator-scripts\") pod \"8341ceba-13e0-410f-a7d2-23190a07d914\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") "
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.398131 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r58f4\" (UniqueName: \"kubernetes.io/projected/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-kube-api-access-r58f4\") pod \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") "
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.398170 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-operator-scripts\") pod \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") "
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.398227 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92444732-2d3e-4065-a336-74b37b711530-operator-scripts\") pod \"92444732-2d3e-4065-a336-74b37b711530\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") "
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.399098 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8341ceba-13e0-410f-a7d2-23190a07d914-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8341ceba-13e0-410f-a7d2-23190a07d914" (UID: "8341ceba-13e0-410f-a7d2-23190a07d914"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.399417 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f06b776-36bc-45ba-88d4-69608f9665e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f06b776-36bc-45ba-88d4-69608f9665e6" (UID: "2f06b776-36bc-45ba-88d4-69608f9665e6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.399455 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92444732-2d3e-4065-a336-74b37b711530-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92444732-2d3e-4065-a336-74b37b711530" (UID: "92444732-2d3e-4065-a336-74b37b711530"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.399699 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b380dfb3-b55b-4db2-bd8f-a90b4470345d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b380dfb3-b55b-4db2-bd8f-a90b4470345d" (UID: "b380dfb3-b55b-4db2-bd8f-a90b4470345d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.399953 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" (UID: "2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.404419 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-kube-api-access-r58f4" (OuterVolumeSpecName: "kube-api-access-r58f4") pod "2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" (UID: "2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d"). InnerVolumeSpecName "kube-api-access-r58f4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.405375 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92444732-2d3e-4065-a336-74b37b711530-kube-api-access-5ccbt" (OuterVolumeSpecName: "kube-api-access-5ccbt") pod "92444732-2d3e-4065-a336-74b37b711530" (UID: "92444732-2d3e-4065-a336-74b37b711530"). InnerVolumeSpecName "kube-api-access-5ccbt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.408367 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8341ceba-13e0-410f-a7d2-23190a07d914-kube-api-access-xjp78" (OuterVolumeSpecName: "kube-api-access-xjp78") pod "8341ceba-13e0-410f-a7d2-23190a07d914" (UID: "8341ceba-13e0-410f-a7d2-23190a07d914"). InnerVolumeSpecName "kube-api-access-xjp78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.408453 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b380dfb3-b55b-4db2-bd8f-a90b4470345d-kube-api-access-t86pm" (OuterVolumeSpecName: "kube-api-access-t86pm") pod "b380dfb3-b55b-4db2-bd8f-a90b4470345d" (UID: "b380dfb3-b55b-4db2-bd8f-a90b4470345d"). InnerVolumeSpecName "kube-api-access-t86pm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.414319 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f06b776-36bc-45ba-88d4-69608f9665e6-kube-api-access-7plt5" (OuterVolumeSpecName: "kube-api-access-7plt5") pod "2f06b776-36bc-45ba-88d4-69608f9665e6" (UID: "2f06b776-36bc-45ba-88d4-69608f9665e6"). InnerVolumeSpecName "kube-api-access-7plt5".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499868 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f06b776-36bc-45ba-88d4-69608f9665e6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499904 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjp78\" (UniqueName: \"kubernetes.io/projected/8341ceba-13e0-410f-a7d2-23190a07d914-kube-api-access-xjp78\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499916 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8341ceba-13e0-410f-a7d2-23190a07d914-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499925 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r58f4\" (UniqueName: \"kubernetes.io/projected/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-kube-api-access-r58f4\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499934 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499942 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92444732-2d3e-4065-a336-74b37b711530-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499952 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b380dfb3-b55b-4db2-bd8f-a90b4470345d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc 
kubenswrapper[4778]: I0318 09:24:42.499961 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t86pm\" (UniqueName: \"kubernetes.io/projected/b380dfb3-b55b-4db2-bd8f-a90b4470345d-kube-api-access-t86pm\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499970 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ccbt\" (UniqueName: \"kubernetes.io/projected/92444732-2d3e-4065-a336-74b37b711530-kube-api-access-5ccbt\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499978 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7plt5\" (UniqueName: \"kubernetes.io/projected/2f06b776-36bc-45ba-88d4-69608f9665e6-kube-api-access-7plt5\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.558256 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-722f-account-create-update-slwd5" event={"ID":"2f06b776-36bc-45ba-88d4-69608f9665e6","Type":"ContainerDied","Data":"7ee8c9776efd482ad7a36b5754dd4923536b7bbd44753fa87fbbf5f1cfa338b4"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.558298 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ee8c9776efd482ad7a36b5754dd4923536b7bbd44753fa87fbbf5f1cfa338b4" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.558350 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-722f-account-create-update-slwd5" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.561985 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t5x58" event={"ID":"b380dfb3-b55b-4db2-bd8f-a90b4470345d","Type":"ContainerDied","Data":"2cf69c2fb75c3f078b4c9042bf5f15177ce33c7cf16cfe38dde0358c80b6c73f"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.562010 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t5x58" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.562028 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cf69c2fb75c3f078b4c9042bf5f15177ce33c7cf16cfe38dde0358c80b6c73f" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.564179 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9hlqk" event={"ID":"92444732-2d3e-4065-a336-74b37b711530","Type":"ContainerDied","Data":"3333452c0f9c0aa6c9b681d6899b93de37fd7ca4bbeb37555194646d5e14f4aa"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.564192 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9hlqk" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.564221 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3333452c0f9c0aa6c9b681d6899b93de37fd7ca4bbeb37555194646d5e14f4aa" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.565755 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-922c-account-create-update-6z2xf" event={"ID":"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e","Type":"ContainerDied","Data":"f6194e96f072cfd53c406f39a7550f29db5d67bdc6367c173d9028ecd67df647"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.565804 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6194e96f072cfd53c406f39a7550f29db5d67bdc6367c173d9028ecd67df647" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.565769 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-922c-account-create-update-6z2xf" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.569331 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerStarted","Data":"4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.569781 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.573963 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" event={"ID":"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d","Type":"ContainerDied","Data":"50d29b828e5b4eae76fdf54a3bce80608ad76949a8ff5e9b561728be802b7516"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.573993 4778 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="50d29b828e5b4eae76fdf54a3bce80608ad76949a8ff5e9b561728be802b7516" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.574044 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.580865 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nl2dg" event={"ID":"8341ceba-13e0-410f-a7d2-23190a07d914","Type":"ContainerDied","Data":"d2ab977039aaae5c8f1427bcc52b3dcaad0308d3866ed320458f83ac1d2b6005"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.580936 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2ab977039aaae5c8f1427bcc52b3dcaad0308d3866ed320458f83ac1d2b6005" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.580958 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nl2dg" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.613846 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.059274284 podStartE2EDuration="7.613815856s" podCreationTimestamp="2026-03-18 09:24:35 +0000 UTC" firstStartedPulling="2026-03-18 09:24:36.505734726 +0000 UTC m=+1343.080479566" lastFinishedPulling="2026-03-18 09:24:42.060276298 +0000 UTC m=+1348.635021138" observedRunningTime="2026-03-18 09:24:42.60184264 +0000 UTC m=+1349.176587490" watchObservedRunningTime="2026-03-18 09:24:42.613815856 +0000 UTC m=+1349.188560696" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.856924 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 09:24:44 crc kubenswrapper[4778]: I0318 09:24:44.873879 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:44 crc kubenswrapper[4778]: I0318 09:24:44.874921 4778 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-central-agent" containerID="cri-o://2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59" gracePeriod=30 Mar 18 09:24:44 crc kubenswrapper[4778]: I0318 09:24:44.875052 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="proxy-httpd" containerID="cri-o://4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f" gracePeriod=30 Mar 18 09:24:44 crc kubenswrapper[4778]: I0318 09:24:44.875103 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="sg-core" containerID="cri-o://4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe" gracePeriod=30 Mar 18 09:24:44 crc kubenswrapper[4778]: I0318 09:24:44.875139 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-notification-agent" containerID="cri-o://5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6" gracePeriod=30 Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.247085 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372081 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-scripts\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372151 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-public-tls-certs\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372289 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-config-data\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-internal-tls-certs\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372344 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg9cl\" (UniqueName: \"kubernetes.io/projected/9c5e9a8c-649d-4d20-b867-bb0f801d329d-kube-api-access-jg9cl\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372434 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9c5e9a8c-649d-4d20-b867-bb0f801d329d-logs\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372457 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-combined-ca-bundle\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.373219 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c5e9a8c-649d-4d20-b867-bb0f801d329d-logs" (OuterVolumeSpecName: "logs") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.378863 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-scripts" (OuterVolumeSpecName: "scripts") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.378920 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5e9a8c-649d-4d20-b867-bb0f801d329d-kube-api-access-jg9cl" (OuterVolumeSpecName: "kube-api-access-jg9cl") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "kube-api-access-jg9cl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.424521 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.439745 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-config-data" (OuterVolumeSpecName: "config-data") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.470237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.472755 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474662 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c5e9a8c-649d-4d20-b867-bb0f801d329d-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474704 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474730 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474746 4778 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474760 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474778 4778 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474792 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg9cl\" (UniqueName: \"kubernetes.io/projected/9c5e9a8c-649d-4d20-b867-bb0f801d329d-kube-api-access-jg9cl\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.611559 4778 generic.go:334] 
"Generic (PLEG): container finished" podID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerID="4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f" exitCode=0 Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.611601 4778 generic.go:334] "Generic (PLEG): container finished" podID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerID="4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe" exitCode=2 Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.611612 4778 generic.go:334] "Generic (PLEG): container finished" podID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerID="5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6" exitCode=0 Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.611626 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerDied","Data":"4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f"} Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.611676 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerDied","Data":"4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe"} Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.611688 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerDied","Data":"5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6"} Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.613691 4778 generic.go:334] "Generic (PLEG): container finished" podID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerID="2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f" exitCode=0 Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.613779 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.613750 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57877776-ssjzt" event={"ID":"9c5e9a8c-649d-4d20-b867-bb0f801d329d","Type":"ContainerDied","Data":"2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f"} Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.613938 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57877776-ssjzt" event={"ID":"9c5e9a8c-649d-4d20-b867-bb0f801d329d","Type":"ContainerDied","Data":"dcaa7760f0d0e632c657a22054c5f006fe1f82143f10b41d8d1cb108f3a1621b"} Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.614020 4778 scope.go:117] "RemoveContainer" containerID="2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.649263 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7b57877776-ssjzt"] Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.655774 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7b57877776-ssjzt"] Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.661398 4778 scope.go:117] "RemoveContainer" containerID="8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.687141 4778 scope.go:117] "RemoveContainer" containerID="2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f" Mar 18 09:24:45 crc kubenswrapper[4778]: E0318 09:24:45.688324 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f\": container with ID starting with 2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f not found: ID does not exist" 
containerID="2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.688384 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f"} err="failed to get container status \"2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f\": rpc error: code = NotFound desc = could not find container \"2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f\": container with ID starting with 2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f not found: ID does not exist" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.688417 4778 scope.go:117] "RemoveContainer" containerID="8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7" Mar 18 09:24:45 crc kubenswrapper[4778]: E0318 09:24:45.689423 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7\": container with ID starting with 8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7 not found: ID does not exist" containerID="8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.689470 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7"} err="failed to get container status \"8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7\": rpc error: code = NotFound desc = could not find container \"8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7\": container with ID starting with 8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7 not found: ID does not exist" Mar 18 09:24:46 crc kubenswrapper[4778]: I0318 09:24:46.206176 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" path="/var/lib/kubelet/pods/9c5e9a8c-649d-4d20-b867-bb0f801d329d/volumes" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.480053 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vhxwz"] Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481123 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481141 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481183 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f06b776-36bc-45ba-88d4-69608f9665e6" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481191 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f06b776-36bc-45ba-88d4-69608f9665e6" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481285 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b380dfb3-b55b-4db2-bd8f-a90b4470345d" containerName="mariadb-database-create" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481294 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b380dfb3-b55b-4db2-bd8f-a90b4470345d" containerName="mariadb-database-create" Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481307 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92444732-2d3e-4065-a336-74b37b711530" containerName="mariadb-database-create" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481315 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="92444732-2d3e-4065-a336-74b37b711530" 
containerName="mariadb-database-create" Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481323 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8341ceba-13e0-410f-a7d2-23190a07d914" containerName="mariadb-database-create" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481331 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8341ceba-13e0-410f-a7d2-23190a07d914" containerName="mariadb-database-create" Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481340 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-api" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481347 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-api" Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481364 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-log" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481372 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-log" Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481386 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481394 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481623 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f06b776-36bc-45ba-88d4-69608f9665e6" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481649 4778 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="8341ceba-13e0-410f-a7d2-23190a07d914" containerName="mariadb-database-create" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481659 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481672 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="92444732-2d3e-4065-a336-74b37b711530" containerName="mariadb-database-create" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481681 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b380dfb3-b55b-4db2-bd8f-a90b4470345d" containerName="mariadb-database-create" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481690 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-log" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481701 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-api" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481710 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.482489 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.485242 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.486133 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.486850 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pgv4m" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.490391 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vhxwz"] Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.545315 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-config-data\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.545398 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvzwc\" (UniqueName: \"kubernetes.io/projected/8348daa3-112d-49f7-93d8-3649ebf10eee-kube-api-access-cvzwc\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.545427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " 
pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.545465 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-scripts\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.647456 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvzwc\" (UniqueName: \"kubernetes.io/projected/8348daa3-112d-49f7-93d8-3649ebf10eee-kube-api-access-cvzwc\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.647519 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.647571 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-scripts\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.647727 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-config-data\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " 
pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.655147 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-config-data\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.655149 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.658739 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-scripts\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.668699 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvzwc\" (UniqueName: \"kubernetes.io/projected/8348daa3-112d-49f7-93d8-3649ebf10eee-kube-api-access-cvzwc\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.805763 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.378334 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vhxwz"] Mar 18 09:24:49 crc kubenswrapper[4778]: W0318 09:24:49.478083 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8348daa3_112d_49f7_93d8_3649ebf10eee.slice/crio-f5bd2fabe74afe68d7b89edc3d7934e5f231bf730e99188b91dab652b2d5599f WatchSource:0}: Error finding container f5bd2fabe74afe68d7b89edc3d7934e5f231bf730e99188b91dab652b2d5599f: Status 404 returned error can't find the container with id f5bd2fabe74afe68d7b89edc3d7934e5f231bf730e99188b91dab652b2d5599f Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.650947 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.664158 4778 generic.go:334] "Generic (PLEG): container finished" podID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerID="2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59" exitCode=0 Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.664309 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerDied","Data":"2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59"} Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.664343 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.664399 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerDied","Data":"4c0e3af08cf9b31b7b8c65fe63263681dd011db44c530c1549fcf2d8b00c430f"} Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.664427 4778 scope.go:117] "RemoveContainer" containerID="4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.666334 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" event={"ID":"8348daa3-112d-49f7-93d8-3649ebf10eee","Type":"ContainerStarted","Data":"f5bd2fabe74afe68d7b89edc3d7934e5f231bf730e99188b91dab652b2d5599f"} Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.670980 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-sg-core-conf-yaml\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.671106 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-config-data\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.671261 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-scripts\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.671481 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-run-httpd\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.671584 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-log-httpd\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.671629 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bstbn\" (UniqueName: \"kubernetes.io/projected/7bbbb56d-564a-45df-b50c-7bc4ba290812-kube-api-access-bstbn\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.672001 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.672171 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.672869 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-combined-ca-bundle\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.673832 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.673884 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.702548 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-scripts" (OuterVolumeSpecName: "scripts") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.716590 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbbb56d-564a-45df-b50c-7bc4ba290812-kube-api-access-bstbn" (OuterVolumeSpecName: "kube-api-access-bstbn") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "kube-api-access-bstbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.731376 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.772315 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.775075 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.775105 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bstbn\" (UniqueName: \"kubernetes.io/projected/7bbbb56d-564a-45df-b50c-7bc4ba290812-kube-api-access-bstbn\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.775119 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.775128 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-sg-core-conf-yaml\") on 
node \"crc\" DevicePath \"\"" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.802552 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-config-data" (OuterVolumeSpecName: "config-data") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.871286 4778 scope.go:117] "RemoveContainer" containerID="4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.877894 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.895768 4778 scope.go:117] "RemoveContainer" containerID="5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.919214 4778 scope.go:117] "RemoveContainer" containerID="2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.937702 4778 scope.go:117] "RemoveContainer" containerID="4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f" Mar 18 09:24:49 crc kubenswrapper[4778]: E0318 09:24:49.938284 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f\": container with ID starting with 4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f not found: ID does not exist" containerID="4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.938333 4778 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f"} err="failed to get container status \"4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f\": rpc error: code = NotFound desc = could not find container \"4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f\": container with ID starting with 4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f not found: ID does not exist" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.938362 4778 scope.go:117] "RemoveContainer" containerID="4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe" Mar 18 09:24:49 crc kubenswrapper[4778]: E0318 09:24:49.938866 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe\": container with ID starting with 4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe not found: ID does not exist" containerID="4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.938917 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe"} err="failed to get container status \"4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe\": rpc error: code = NotFound desc = could not find container \"4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe\": container with ID starting with 4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe not found: ID does not exist" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.938945 4778 scope.go:117] "RemoveContainer" containerID="5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6" Mar 18 09:24:49 crc kubenswrapper[4778]: E0318 09:24:49.939529 4778 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6\": container with ID starting with 5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6 not found: ID does not exist" containerID="5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.939560 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6"} err="failed to get container status \"5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6\": rpc error: code = NotFound desc = could not find container \"5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6\": container with ID starting with 5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6 not found: ID does not exist" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.939578 4778 scope.go:117] "RemoveContainer" containerID="2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59" Mar 18 09:24:49 crc kubenswrapper[4778]: E0318 09:24:49.939898 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59\": container with ID starting with 2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59 not found: ID does not exist" containerID="2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59" Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.939924 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59"} err="failed to get container status \"2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59\": rpc error: code = NotFound desc = could 
not find container \"2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59\": container with ID starting with 2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59 not found: ID does not exist" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.046715 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.065067 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.077568 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:50 crc kubenswrapper[4778]: E0318 09:24:50.078255 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-central-agent" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078284 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-central-agent" Mar 18 09:24:50 crc kubenswrapper[4778]: E0318 09:24:50.078371 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-notification-agent" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078385 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-notification-agent" Mar 18 09:24:50 crc kubenswrapper[4778]: E0318 09:24:50.078416 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="sg-core" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078427 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="sg-core" Mar 18 09:24:50 crc kubenswrapper[4778]: E0318 09:24:50.078450 4778 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="proxy-httpd" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078461 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="proxy-httpd" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078714 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-central-agent" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078743 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="proxy-httpd" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078761 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-notification-agent" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078770 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="sg-core" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.081491 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.085435 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.085567 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.087382 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.187922 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-scripts\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.188315 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.188349 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92rb8\" (UniqueName: \"kubernetes.io/projected/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-kube-api-access-92rb8\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.188366 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-log-httpd\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " 
pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.188409 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.188432 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-run-httpd\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.188458 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-config-data\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.197338 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" path="/var/lib/kubelet/pods/7bbbb56d-564a-45df-b50c-7bc4ba290812/volumes" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290048 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290155 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92rb8\" (UniqueName: 
\"kubernetes.io/projected/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-kube-api-access-92rb8\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290213 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-log-httpd\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290301 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290345 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-run-httpd\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290398 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-config-data\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290452 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-scripts\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.292617 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-log-httpd\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.292887 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-run-httpd\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.295744 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.296219 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-config-data\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.297487 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-scripts\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.298314 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 
09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.309190 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92rb8\" (UniqueName: \"kubernetes.io/projected/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-kube-api-access-92rb8\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.407098 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.949951 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:50 crc kubenswrapper[4778]: W0318 09:24:50.954520 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1c91e10_5caf_4f06_89bb_c9dacc92ecef.slice/crio-f02f2bc76544bcc97a0d67474e80bad569ca14d09e55a75f25e0c054c4e8be41 WatchSource:0}: Error finding container f02f2bc76544bcc97a0d67474e80bad569ca14d09e55a75f25e0c054c4e8be41: Status 404 returned error can't find the container with id f02f2bc76544bcc97a0d67474e80bad569ca14d09e55a75f25e0c054c4e8be41 Mar 18 09:24:51 crc kubenswrapper[4778]: I0318 09:24:51.692474 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerStarted","Data":"f02f2bc76544bcc97a0d67474e80bad569ca14d09e55a75f25e0c054c4e8be41"} Mar 18 09:24:52 crc kubenswrapper[4778]: I0318 09:24:52.705761 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerStarted","Data":"5bf367bc7ca1760e31682d105fada5e12302e00c9e7370ed60b5dbbd5a0d181b"} Mar 18 09:24:52 crc kubenswrapper[4778]: I0318 09:24:52.706185 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerStarted","Data":"c9b5655db1df35344b5ba27e4f2b48e2b1e3b0ed8aae0f66ceb4cd01dacb6163"} Mar 18 09:24:56 crc kubenswrapper[4778]: I0318 09:24:56.744164 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" event={"ID":"8348daa3-112d-49f7-93d8-3649ebf10eee","Type":"ContainerStarted","Data":"aed5c2d54c93258cf5753b658cc8a1430cb39fb4faca41384d49dcc12f51df37"} Mar 18 09:24:56 crc kubenswrapper[4778]: I0318 09:24:56.772562 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" podStartSLOduration=1.707374648 podStartE2EDuration="8.772535808s" podCreationTimestamp="2026-03-18 09:24:48 +0000 UTC" firstStartedPulling="2026-03-18 09:24:49.48094113 +0000 UTC m=+1356.055685980" lastFinishedPulling="2026-03-18 09:24:56.5461023 +0000 UTC m=+1363.120847140" observedRunningTime="2026-03-18 09:24:56.763550923 +0000 UTC m=+1363.338295803" watchObservedRunningTime="2026-03-18 09:24:56.772535808 +0000 UTC m=+1363.347280678" Mar 18 09:24:57 crc kubenswrapper[4778]: I0318 09:24:57.774113 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerStarted","Data":"5aea936acc296e6d46f72dc7943fff348d82ba9d5c552a78ef0902e78f275280"} Mar 18 09:24:58 crc kubenswrapper[4778]: I0318 09:24:58.788887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerStarted","Data":"7e71673254bb31ec9a40bba9a74a5d569998bf03658635f51dde9500e1435648"} Mar 18 09:24:58 crc kubenswrapper[4778]: I0318 09:24:58.789372 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:24:58 crc kubenswrapper[4778]: I0318 09:24:58.825798 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=1.543137004 podStartE2EDuration="8.825773666s" podCreationTimestamp="2026-03-18 09:24:50 +0000 UTC" firstStartedPulling="2026-03-18 09:24:50.958376055 +0000 UTC m=+1357.533120915" lastFinishedPulling="2026-03-18 09:24:58.241012737 +0000 UTC m=+1364.815757577" observedRunningTime="2026-03-18 09:24:58.813185503 +0000 UTC m=+1365.387930383" watchObservedRunningTime="2026-03-18 09:24:58.825773666 +0000 UTC m=+1365.400518546" Mar 18 09:25:09 crc kubenswrapper[4778]: I0318 09:25:09.072265 4778 trace.go:236] Trace[1554972338]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-8ch6j" (18-Mar-2026 09:25:07.742) (total time: 1329ms): Mar 18 09:25:09 crc kubenswrapper[4778]: Trace[1554972338]: [1.329360951s] [1.329360951s] END Mar 18 09:25:10 crc kubenswrapper[4778]: I0318 09:25:10.082232 4778 generic.go:334] "Generic (PLEG): container finished" podID="8348daa3-112d-49f7-93d8-3649ebf10eee" containerID="aed5c2d54c93258cf5753b658cc8a1430cb39fb4faca41384d49dcc12f51df37" exitCode=0 Mar 18 09:25:10 crc kubenswrapper[4778]: I0318 09:25:10.082288 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" event={"ID":"8348daa3-112d-49f7-93d8-3649ebf10eee","Type":"ContainerDied","Data":"aed5c2d54c93258cf5753b658cc8a1430cb39fb4faca41384d49dcc12f51df37"} Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.471424 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.577467 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-combined-ca-bundle\") pod \"8348daa3-112d-49f7-93d8-3649ebf10eee\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.577621 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvzwc\" (UniqueName: \"kubernetes.io/projected/8348daa3-112d-49f7-93d8-3649ebf10eee-kube-api-access-cvzwc\") pod \"8348daa3-112d-49f7-93d8-3649ebf10eee\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.577704 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-config-data\") pod \"8348daa3-112d-49f7-93d8-3649ebf10eee\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.577885 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-scripts\") pod \"8348daa3-112d-49f7-93d8-3649ebf10eee\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.583618 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-scripts" (OuterVolumeSpecName: "scripts") pod "8348daa3-112d-49f7-93d8-3649ebf10eee" (UID: "8348daa3-112d-49f7-93d8-3649ebf10eee"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.585438 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8348daa3-112d-49f7-93d8-3649ebf10eee-kube-api-access-cvzwc" (OuterVolumeSpecName: "kube-api-access-cvzwc") pod "8348daa3-112d-49f7-93d8-3649ebf10eee" (UID: "8348daa3-112d-49f7-93d8-3649ebf10eee"). InnerVolumeSpecName "kube-api-access-cvzwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.611708 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8348daa3-112d-49f7-93d8-3649ebf10eee" (UID: "8348daa3-112d-49f7-93d8-3649ebf10eee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.617507 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-config-data" (OuterVolumeSpecName: "config-data") pod "8348daa3-112d-49f7-93d8-3649ebf10eee" (UID: "8348daa3-112d-49f7-93d8-3649ebf10eee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.681219 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.681306 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.681337 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvzwc\" (UniqueName: \"kubernetes.io/projected/8348daa3-112d-49f7-93d8-3649ebf10eee-kube-api-access-cvzwc\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.681358 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.120074 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" event={"ID":"8348daa3-112d-49f7-93d8-3649ebf10eee","Type":"ContainerDied","Data":"f5bd2fabe74afe68d7b89edc3d7934e5f231bf730e99188b91dab652b2d5599f"} Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.120892 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5bd2fabe74afe68d7b89edc3d7934e5f231bf730e99188b91dab652b2d5599f" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.121090 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.274672 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 09:25:12 crc kubenswrapper[4778]: E0318 09:25:12.275300 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8348daa3-112d-49f7-93d8-3649ebf10eee" containerName="nova-cell0-conductor-db-sync" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.275392 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8348daa3-112d-49f7-93d8-3649ebf10eee" containerName="nova-cell0-conductor-db-sync" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.275653 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8348daa3-112d-49f7-93d8-3649ebf10eee" containerName="nova-cell0-conductor-db-sync" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.276276 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.287227 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.289808 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.289884 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pgv4m" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.402507 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc908a0-dc90-4df9-869c-5c0820cac423-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 
09:25:12.402626 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt5w5\" (UniqueName: \"kubernetes.io/projected/3fc908a0-dc90-4df9-869c-5c0820cac423-kube-api-access-gt5w5\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.402935 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc908a0-dc90-4df9-869c-5c0820cac423-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.504355 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc908a0-dc90-4df9-869c-5c0820cac423-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.504405 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt5w5\" (UniqueName: \"kubernetes.io/projected/3fc908a0-dc90-4df9-869c-5c0820cac423-kube-api-access-gt5w5\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.504510 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc908a0-dc90-4df9-869c-5c0820cac423-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.508744 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc908a0-dc90-4df9-869c-5c0820cac423-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.511133 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc908a0-dc90-4df9-869c-5c0820cac423-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.530097 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt5w5\" (UniqueName: \"kubernetes.io/projected/3fc908a0-dc90-4df9-869c-5c0820cac423-kube-api-access-gt5w5\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.602700 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:13 crc kubenswrapper[4778]: I0318 09:25:13.134230 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 09:25:13 crc kubenswrapper[4778]: W0318 09:25:13.136781 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fc908a0_dc90_4df9_869c_5c0820cac423.slice/crio-eef00d170f457911eaa2258c8ba86e0253ab6e40474d31b674f9d76ea4b4194a WatchSource:0}: Error finding container eef00d170f457911eaa2258c8ba86e0253ab6e40474d31b674f9d76ea4b4194a: Status 404 returned error can't find the container with id eef00d170f457911eaa2258c8ba86e0253ab6e40474d31b674f9d76ea4b4194a Mar 18 09:25:14 crc kubenswrapper[4778]: I0318 09:25:14.147734 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3fc908a0-dc90-4df9-869c-5c0820cac423","Type":"ContainerStarted","Data":"d54334f64ea5e42752a6984d065a46f7c7ff4fcec00760b8bfa17fb3f3750ce7"} Mar 18 09:25:14 crc kubenswrapper[4778]: I0318 09:25:14.148360 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3fc908a0-dc90-4df9-869c-5c0820cac423","Type":"ContainerStarted","Data":"eef00d170f457911eaa2258c8ba86e0253ab6e40474d31b674f9d76ea4b4194a"} Mar 18 09:25:14 crc kubenswrapper[4778]: I0318 09:25:14.148712 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:14 crc kubenswrapper[4778]: I0318 09:25:14.198067 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.198010973 podStartE2EDuration="2.198010973s" podCreationTimestamp="2026-03-18 09:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
09:25:14.177904046 +0000 UTC m=+1380.752648966" watchObservedRunningTime="2026-03-18 09:25:14.198010973 +0000 UTC m=+1380.772755853" Mar 18 09:25:20 crc kubenswrapper[4778]: I0318 09:25:20.413442 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 09:25:22 crc kubenswrapper[4778]: I0318 09:25:22.630268 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.224745 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mjs29"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.226657 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.229808 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.229937 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.243688 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mjs29"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.327678 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-config-data\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.327730 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-scripts\") pod 
\"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.327759 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.327827 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrl8g\" (UniqueName: \"kubernetes.io/projected/b89439e3-a138-4aa8-98a4-2e23ce3819e0-kube-api-access-wrl8g\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.413249 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.415878 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.430799 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.430893 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrl8g\" (UniqueName: \"kubernetes.io/projected/b89439e3-a138-4aa8-98a4-2e23ce3819e0-kube-api-access-wrl8g\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.430982 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-config-data\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.431014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-scripts\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.433662 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.437883 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-scripts\") pod 
\"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.441838 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-config-data\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.462469 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.464010 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.474776 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrl8g\" (UniqueName: \"kubernetes.io/projected/b89439e3-a138-4aa8-98a4-2e23ce3819e0-kube-api-access-wrl8g\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.537537 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/827e3d5b-c1fe-4634-b819-4d816911b71e-logs\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.537635 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.537678 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rlzt\" (UniqueName: \"kubernetes.io/projected/827e3d5b-c1fe-4634-b819-4d816911b71e-kube-api-access-7rlzt\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.537727 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-config-data\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.553878 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.599406 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.601938 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.605755 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.614097 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.655526 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.655584 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rlzt\" (UniqueName: \"kubernetes.io/projected/827e3d5b-c1fe-4634-b819-4d816911b71e-kube-api-access-7rlzt\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.655626 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-config-data\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.655691 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/827e3d5b-c1fe-4634-b819-4d816911b71e-logs\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.656095 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/827e3d5b-c1fe-4634-b819-4d816911b71e-logs\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.663286 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.664752 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.669279 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.676083 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-f44bz"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.677480 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.678602 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.709176 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-config-data\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.718849 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.760990 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-config\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761036 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761068 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-config-data\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761106 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9vgw\" (UniqueName: \"kubernetes.io/projected/ecb86d82-de0e-474c-9942-a8dff1f8739b-kube-api-access-s9vgw\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761146 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-config-data\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761179 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761231 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dfe71cf-9dde-4056-a41b-a36c1773ace5-logs\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761251 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vww5\" (UniqueName: \"kubernetes.io/projected/8dfe71cf-9dde-4056-a41b-a36c1773ace5-kube-api-access-8vww5\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761289 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761316 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sc9l\" (UniqueName: \"kubernetes.io/projected/326b0319-1314-4e0c-9e38-7f0358087107-kube-api-access-7sc9l\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761334 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761368 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.796466 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rlzt\" (UniqueName: \"kubernetes.io/projected/827e3d5b-c1fe-4634-b819-4d816911b71e-kube-api-access-7rlzt\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.812034 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-f44bz"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.843274 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.854153 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864184 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9vgw\" (UniqueName: \"kubernetes.io/projected/ecb86d82-de0e-474c-9942-a8dff1f8739b-kube-api-access-s9vgw\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864247 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-config-data\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864288 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864325 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dfe71cf-9dde-4056-a41b-a36c1773ace5-logs\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864343 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vww5\" (UniqueName: \"kubernetes.io/projected/8dfe71cf-9dde-4056-a41b-a36c1773ace5-kube-api-access-8vww5\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864367 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sc9l\" (UniqueName: \"kubernetes.io/projected/326b0319-1314-4e0c-9e38-7f0358087107-kube-api-access-7sc9l\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864418 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864434 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864486 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-config\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864511 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-config-data\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.866293 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.866540 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dfe71cf-9dde-4056-a41b-a36c1773ace5-logs\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.877527 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.877789 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.879051 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-config\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.880907 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.884807 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-config-data\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.890421 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.891349 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.896015 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.900186 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-config-data\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.910880 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9vgw\" (UniqueName: \"kubernetes.io/projected/ecb86d82-de0e-474c-9942-a8dff1f8739b-kube-api-access-s9vgw\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.925864 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sc9l\" (UniqueName: \"kubernetes.io/projected/326b0319-1314-4e0c-9e38-7f0358087107-kube-api-access-7sc9l\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.926053 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8vww5\" (UniqueName: \"kubernetes.io/projected/8dfe71cf-9dde-4056-a41b-a36c1773ace5-kube-api-access-8vww5\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.967400 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bdst\" (UniqueName: \"kubernetes.io/projected/81cc74f1-64bc-448f-9654-352927efbb4c-kube-api-access-8bdst\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.967546 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.967651 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.984889 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.069579 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.069710 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.069776 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bdst\" (UniqueName: \"kubernetes.io/projected/81cc74f1-64bc-448f-9654-352927efbb4c-kube-api-access-8bdst\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.082996 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.090501 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.091567 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bdst\" (UniqueName: \"kubernetes.io/projected/81cc74f1-64bc-448f-9654-352927efbb4c-kube-api-access-8bdst\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.138513 4778 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.151287 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.188682 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.188964 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.439425 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mjs29"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.606481 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.619422 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jbjb9"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.621003 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.623572 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.626618 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.646467 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jbjb9"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.705089 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.784075 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.788419 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.788469 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-config-data\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.788508 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldpq2\" (UniqueName: 
\"kubernetes.io/projected/e85d64a6-99af-4b66-9a60-cd6a046af840-kube-api-access-ldpq2\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.788809 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-scripts\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.791524 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-f44bz"] Mar 18 09:25:24 crc kubenswrapper[4778]: W0318 09:25:24.796324 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecb86d82_de0e_474c_9942_a8dff1f8739b.slice/crio-a9b184c2a1d038cecffbf8235c70616929fdbf84785e09acda62d2b9837c969e WatchSource:0}: Error finding container a9b184c2a1d038cecffbf8235c70616929fdbf84785e09acda62d2b9837c969e: Status 404 returned error can't find the container with id a9b184c2a1d038cecffbf8235c70616929fdbf84785e09acda62d2b9837c969e Mar 18 09:25:24 crc kubenswrapper[4778]: W0318 09:25:24.796645 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod326b0319_1314_4e0c_9e38_7f0358087107.slice/crio-c61d899fafa112db54c6c9a253fee0265e8eb8a1263e04ecef015a4a97256ebe WatchSource:0}: Error finding container c61d899fafa112db54c6c9a253fee0265e8eb8a1263e04ecef015a4a97256ebe: Status 404 returned error can't find the container with id c61d899fafa112db54c6c9a253fee0265e8eb8a1263e04ecef015a4a97256ebe Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.863457 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:25:24 crc kubenswrapper[4778]: W0318 09:25:24.863961 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81cc74f1_64bc_448f_9654_352927efbb4c.slice/crio-faecd33981e31af7bf2e1b8e7eefd8b08f1e463cade44ffd3aabfca3aa9b5d91 WatchSource:0}: Error finding container faecd33981e31af7bf2e1b8e7eefd8b08f1e463cade44ffd3aabfca3aa9b5d91: Status 404 returned error can't find the container with id faecd33981e31af7bf2e1b8e7eefd8b08f1e463cade44ffd3aabfca3aa9b5d91 Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.890010 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.890058 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-config-data\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.890095 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldpq2\" (UniqueName: \"kubernetes.io/projected/e85d64a6-99af-4b66-9a60-cd6a046af840-kube-api-access-ldpq2\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.890170 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-scripts\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.894779 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-config-data\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.894809 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.895656 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-scripts\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.907433 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldpq2\" (UniqueName: \"kubernetes.io/projected/e85d64a6-99af-4b66-9a60-cd6a046af840-kube-api-access-ldpq2\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.001686 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:25 crc kubenswrapper[4778]: E0318 09:25:25.123769 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecb86d82_de0e_474c_9942_a8dff1f8739b.slice/crio-conmon-fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6.scope\": RecentStats: unable to find data in memory cache]" Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.259283 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8dfe71cf-9dde-4056-a41b-a36c1773ace5","Type":"ContainerStarted","Data":"b1ef1193696aa80f8e8dae0aeb903c8f66ca8c8c56434acbb7fe2f05ed91d084"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.264499 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mjs29" event={"ID":"b89439e3-a138-4aa8-98a4-2e23ce3819e0","Type":"ContainerStarted","Data":"462bde6149a0c02e0a81e9d8cf7097470bbd3546789cdc6d2d61c3177437187e"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.264553 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mjs29" event={"ID":"b89439e3-a138-4aa8-98a4-2e23ce3819e0","Type":"ContainerStarted","Data":"829ccaa44c975124734c6e174a8e09a9966fa6e5717d8a5911b2a376e783ecf2"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.267475 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"827e3d5b-c1fe-4634-b819-4d816911b71e","Type":"ContainerStarted","Data":"af02077ff8be3a9cb17ac9dfdbd7efd69c18c248340c749dae5c8e266151e304"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.271511 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"81cc74f1-64bc-448f-9654-352927efbb4c","Type":"ContainerStarted","Data":"faecd33981e31af7bf2e1b8e7eefd8b08f1e463cade44ffd3aabfca3aa9b5d91"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.273426 4778 generic.go:334] "Generic (PLEG): container finished" podID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerID="fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6" exitCode=0 Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.273570 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" event={"ID":"ecb86d82-de0e-474c-9942-a8dff1f8739b","Type":"ContainerDied","Data":"fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.273609 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" event={"ID":"ecb86d82-de0e-474c-9942-a8dff1f8739b","Type":"ContainerStarted","Data":"a9b184c2a1d038cecffbf8235c70616929fdbf84785e09acda62d2b9837c969e"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.274692 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"326b0319-1314-4e0c-9e38-7f0358087107","Type":"ContainerStarted","Data":"c61d899fafa112db54c6c9a253fee0265e8eb8a1263e04ecef015a4a97256ebe"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.291698 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mjs29" podStartSLOduration=2.291673675 podStartE2EDuration="2.291673675s" podCreationTimestamp="2026-03-18 09:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:25.280708856 +0000 UTC m=+1391.855453686" watchObservedRunningTime="2026-03-18 09:25:25.291673675 +0000 UTC m=+1391.866418505" Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.514396 4778 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jbjb9"] Mar 18 09:25:25 crc kubenswrapper[4778]: W0318 09:25:25.518050 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode85d64a6_99af_4b66_9a60_cd6a046af840.slice/crio-9118c3ac4661b3103a217848a93eccdeeeb28befd44dfdb72aa05904c8bd2a92 WatchSource:0}: Error finding container 9118c3ac4661b3103a217848a93eccdeeeb28befd44dfdb72aa05904c8bd2a92: Status 404 returned error can't find the container with id 9118c3ac4661b3103a217848a93eccdeeeb28befd44dfdb72aa05904c8bd2a92 Mar 18 09:25:26 crc kubenswrapper[4778]: I0318 09:25:26.288786 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" event={"ID":"e85d64a6-99af-4b66-9a60-cd6a046af840","Type":"ContainerStarted","Data":"973e9f8f665d67a226625f9e044e9c18b31cbfecdc6ee8dcf02562081d63ced4"} Mar 18 09:25:26 crc kubenswrapper[4778]: I0318 09:25:26.289290 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" event={"ID":"e85d64a6-99af-4b66-9a60-cd6a046af840","Type":"ContainerStarted","Data":"9118c3ac4661b3103a217848a93eccdeeeb28befd44dfdb72aa05904c8bd2a92"} Mar 18 09:25:26 crc kubenswrapper[4778]: I0318 09:25:26.293304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" event={"ID":"ecb86d82-de0e-474c-9942-a8dff1f8739b","Type":"ContainerStarted","Data":"e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e"} Mar 18 09:25:26 crc kubenswrapper[4778]: I0318 09:25:26.316609 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" podStartSLOduration=2.316583503 podStartE2EDuration="2.316583503s" podCreationTimestamp="2026-03-18 09:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 09:25:26.302585782 +0000 UTC m=+1392.877330622" watchObservedRunningTime="2026-03-18 09:25:26.316583503 +0000 UTC m=+1392.891328343" Mar 18 09:25:26 crc kubenswrapper[4778]: I0318 09:25:26.330348 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" podStartSLOduration=3.330329868 podStartE2EDuration="3.330329868s" podCreationTimestamp="2026-03-18 09:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:26.326203766 +0000 UTC m=+1392.900948626" watchObservedRunningTime="2026-03-18 09:25:26.330329868 +0000 UTC m=+1392.905074708" Mar 18 09:25:27 crc kubenswrapper[4778]: I0318 09:25:27.301711 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.203630 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.212370 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.314271 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"827e3d5b-c1fe-4634-b819-4d816911b71e","Type":"ContainerStarted","Data":"3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1"} Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.316139 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"326b0319-1314-4e0c-9e38-7f0358087107","Type":"ContainerStarted","Data":"4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc"} Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.318288 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"8dfe71cf-9dde-4056-a41b-a36c1773ace5","Type":"ContainerStarted","Data":"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8"} Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.343754 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.100716173 podStartE2EDuration="5.343730991s" podCreationTimestamp="2026-03-18 09:25:23 +0000 UTC" firstStartedPulling="2026-03-18 09:25:24.802799739 +0000 UTC m=+1391.377544579" lastFinishedPulling="2026-03-18 09:25:27.045814527 +0000 UTC m=+1393.620559397" observedRunningTime="2026-03-18 09:25:28.338235451 +0000 UTC m=+1394.912980321" watchObservedRunningTime="2026-03-18 09:25:28.343730991 +0000 UTC m=+1394.918475841" Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.414963 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.415235 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="45babbce-b5d2-4ad5-8bc2-a5047e777e8d" containerName="kube-state-metrics" containerID="cri-o://3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed" gracePeriod=30 Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.928260 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.991915 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xklgw\" (UniqueName: \"kubernetes.io/projected/45babbce-b5d2-4ad5-8bc2-a5047e777e8d-kube-api-access-xklgw\") pod \"45babbce-b5d2-4ad5-8bc2-a5047e777e8d\" (UID: \"45babbce-b5d2-4ad5-8bc2-a5047e777e8d\") " Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.000279 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45babbce-b5d2-4ad5-8bc2-a5047e777e8d-kube-api-access-xklgw" (OuterVolumeSpecName: "kube-api-access-xklgw") pod "45babbce-b5d2-4ad5-8bc2-a5047e777e8d" (UID: "45babbce-b5d2-4ad5-8bc2-a5047e777e8d"). InnerVolumeSpecName "kube-api-access-xklgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.095251 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xklgw\" (UniqueName: \"kubernetes.io/projected/45babbce-b5d2-4ad5-8bc2-a5047e777e8d-kube-api-access-xklgw\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.152265 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.330387 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8dfe71cf-9dde-4056-a41b-a36c1773ace5","Type":"ContainerStarted","Data":"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d"} Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.330393 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-log" containerID="cri-o://83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8" gracePeriod=30 Mar 
18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.330628 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-metadata" containerID="cri-o://fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d" gracePeriod=30 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.339462 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"827e3d5b-c1fe-4634-b819-4d816911b71e","Type":"ContainerStarted","Data":"db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d"} Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.342422 4778 generic.go:334] "Generic (PLEG): container finished" podID="45babbce-b5d2-4ad5-8bc2-a5047e777e8d" containerID="3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed" exitCode=2 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.342499 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.342526 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"45babbce-b5d2-4ad5-8bc2-a5047e777e8d","Type":"ContainerDied","Data":"3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed"} Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.342591 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"45babbce-b5d2-4ad5-8bc2-a5047e777e8d","Type":"ContainerDied","Data":"aea881ea048223a1a79ed7ce2d76ae73c7092387d05c9eaeaf63d09ce8b8e125"} Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.342615 4778 scope.go:117] "RemoveContainer" containerID="3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.345453 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81cc74f1-64bc-448f-9654-352927efbb4c","Type":"ContainerStarted","Data":"e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5"} Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.345763 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="81cc74f1-64bc-448f-9654-352927efbb4c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5" gracePeriod=30 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.387458 4778 scope.go:117] "RemoveContainer" containerID="3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed" Mar 18 09:25:29 crc kubenswrapper[4778]: E0318 09:25:29.390042 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed\": container with ID starting 
with 3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed not found: ID does not exist" containerID="3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.390077 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed"} err="failed to get container status \"3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed\": rpc error: code = NotFound desc = could not find container \"3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed\": container with ID starting with 3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed not found: ID does not exist" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.392946 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.276761239 podStartE2EDuration="6.39293462s" podCreationTimestamp="2026-03-18 09:25:23 +0000 UTC" firstStartedPulling="2026-03-18 09:25:24.867134112 +0000 UTC m=+1391.441878952" lastFinishedPulling="2026-03-18 09:25:27.983307493 +0000 UTC m=+1394.558052333" observedRunningTime="2026-03-18 09:25:29.387155883 +0000 UTC m=+1395.961900723" watchObservedRunningTime="2026-03-18 09:25:29.39293462 +0000 UTC m=+1395.967679460" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.395296 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.05511777 podStartE2EDuration="6.395287464s" podCreationTimestamp="2026-03-18 09:25:23 +0000 UTC" firstStartedPulling="2026-03-18 09:25:24.707508293 +0000 UTC m=+1391.282253133" lastFinishedPulling="2026-03-18 09:25:27.047677957 +0000 UTC m=+1393.622422827" observedRunningTime="2026-03-18 09:25:29.363333654 +0000 UTC m=+1395.938078514" watchObservedRunningTime="2026-03-18 09:25:29.395287464 +0000 UTC 
m=+1395.970032304" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.419091 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.987817238 podStartE2EDuration="6.419068823s" podCreationTimestamp="2026-03-18 09:25:23 +0000 UTC" firstStartedPulling="2026-03-18 09:25:24.626461016 +0000 UTC m=+1391.201205856" lastFinishedPulling="2026-03-18 09:25:27.057712601 +0000 UTC m=+1393.632457441" observedRunningTime="2026-03-18 09:25:29.412980717 +0000 UTC m=+1395.987725557" watchObservedRunningTime="2026-03-18 09:25:29.419068823 +0000 UTC m=+1395.993813663" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.444297 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.448926 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.480886 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:25:29 crc kubenswrapper[4778]: E0318 09:25:29.481713 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45babbce-b5d2-4ad5-8bc2-a5047e777e8d" containerName="kube-state-metrics" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.481798 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="45babbce-b5d2-4ad5-8bc2-a5047e777e8d" containerName="kube-state-metrics" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.482155 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="45babbce-b5d2-4ad5-8bc2-a5047e777e8d" containerName="kube-state-metrics" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.483136 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.490869 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.491339 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.494686 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.505120 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6w8p\" (UniqueName: \"kubernetes.io/projected/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-api-access-r6w8p\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.505228 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.505272 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.505309 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.607361 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.607478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6w8p\" (UniqueName: \"kubernetes.io/projected/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-api-access-r6w8p\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.607524 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.607557 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.620515 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.622674 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.630460 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.633417 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6w8p\" (UniqueName: \"kubernetes.io/projected/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-api-access-r6w8p\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.811305 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.811836 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-central-agent" containerID="cri-o://c9b5655db1df35344b5ba27e4f2b48e2b1e3b0ed8aae0f66ceb4cd01dacb6163" gracePeriod=30 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.812662 4778 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="proxy-httpd" containerID="cri-o://7e71673254bb31ec9a40bba9a74a5d569998bf03658635f51dde9500e1435648" gracePeriod=30 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.812751 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="sg-core" containerID="cri-o://5aea936acc296e6d46f72dc7943fff348d82ba9d5c552a78ef0902e78f275280" gracePeriod=30 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.812813 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-notification-agent" containerID="cri-o://5bf367bc7ca1760e31682d105fada5e12302e00c9e7370ed60b5dbbd5a0d181b" gracePeriod=30 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.823723 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.045310 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.134666 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-config-data\") pod \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.134716 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vww5\" (UniqueName: \"kubernetes.io/projected/8dfe71cf-9dde-4056-a41b-a36c1773ace5-kube-api-access-8vww5\") pod \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.134751 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-combined-ca-bundle\") pod \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.134832 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dfe71cf-9dde-4056-a41b-a36c1773ace5-logs\") pod \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.135658 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dfe71cf-9dde-4056-a41b-a36c1773ace5-logs" (OuterVolumeSpecName: "logs") pod "8dfe71cf-9dde-4056-a41b-a36c1773ace5" (UID: "8dfe71cf-9dde-4056-a41b-a36c1773ace5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.151626 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfe71cf-9dde-4056-a41b-a36c1773ace5-kube-api-access-8vww5" (OuterVolumeSpecName: "kube-api-access-8vww5") pod "8dfe71cf-9dde-4056-a41b-a36c1773ace5" (UID: "8dfe71cf-9dde-4056-a41b-a36c1773ace5"). InnerVolumeSpecName "kube-api-access-8vww5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.153705 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.153739 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.175403 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dfe71cf-9dde-4056-a41b-a36c1773ace5" (UID: "8dfe71cf-9dde-4056-a41b-a36c1773ace5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.198765 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-config-data" (OuterVolumeSpecName: "config-data") pod "8dfe71cf-9dde-4056-a41b-a36c1773ace5" (UID: "8dfe71cf-9dde-4056-a41b-a36c1773ace5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.203562 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45babbce-b5d2-4ad5-8bc2-a5047e777e8d" path="/var/lib/kubelet/pods/45babbce-b5d2-4ad5-8bc2-a5047e777e8d/volumes" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.237741 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.237779 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vww5\" (UniqueName: \"kubernetes.io/projected/8dfe71cf-9dde-4056-a41b-a36c1773ace5-kube-api-access-8vww5\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.237791 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.237800 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dfe71cf-9dde-4056-a41b-a36c1773ace5-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.357409 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" 
containerID="7e71673254bb31ec9a40bba9a74a5d569998bf03658635f51dde9500e1435648" exitCode=0 Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.357439 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerID="5aea936acc296e6d46f72dc7943fff348d82ba9d5c552a78ef0902e78f275280" exitCode=2 Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.357500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerDied","Data":"7e71673254bb31ec9a40bba9a74a5d569998bf03658635f51dde9500e1435648"} Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.357569 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerDied","Data":"5aea936acc296e6d46f72dc7943fff348d82ba9d5c552a78ef0902e78f275280"} Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359584 4778 generic.go:334] "Generic (PLEG): container finished" podID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerID="fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d" exitCode=0 Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359617 4778 generic.go:334] "Generic (PLEG): container finished" podID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerID="83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8" exitCode=143 Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359724 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8dfe71cf-9dde-4056-a41b-a36c1773ace5","Type":"ContainerDied","Data":"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d"} Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359800 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359803 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8dfe71cf-9dde-4056-a41b-a36c1773ace5","Type":"ContainerDied","Data":"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8"} Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359885 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8dfe71cf-9dde-4056-a41b-a36c1773ace5","Type":"ContainerDied","Data":"b1ef1193696aa80f8e8dae0aeb903c8f66ca8c8c56434acbb7fe2f05ed91d084"} Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359824 4778 scope.go:117] "RemoveContainer" containerID="fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.392868 4778 scope.go:117] "RemoveContainer" containerID="83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.396719 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.412500 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.429650 4778 scope.go:117] "RemoveContainer" containerID="fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d" Mar 18 09:25:30 crc kubenswrapper[4778]: E0318 09:25:30.432644 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d\": container with ID starting with fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d not found: ID does not exist" containerID="fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d" Mar 18 09:25:30 crc kubenswrapper[4778]: 
I0318 09:25:30.432695 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d"} err="failed to get container status \"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d\": rpc error: code = NotFound desc = could not find container \"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d\": container with ID starting with fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d not found: ID does not exist" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.432726 4778 scope.go:117] "RemoveContainer" containerID="83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8" Mar 18 09:25:30 crc kubenswrapper[4778]: E0318 09:25:30.435757 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8\": container with ID starting with 83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8 not found: ID does not exist" containerID="83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.435798 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8"} err="failed to get container status \"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8\": rpc error: code = NotFound desc = could not find container \"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8\": container with ID starting with 83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8 not found: ID does not exist" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.435822 4778 scope.go:117] "RemoveContainer" containerID="fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d" Mar 18 09:25:30 crc 
kubenswrapper[4778]: I0318 09:25:30.437435 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d"} err="failed to get container status \"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d\": rpc error: code = NotFound desc = could not find container \"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d\": container with ID starting with fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d not found: ID does not exist" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.437470 4778 scope.go:117] "RemoveContainer" containerID="83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.437702 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8"} err="failed to get container status \"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8\": rpc error: code = NotFound desc = could not find container \"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8\": container with ID starting with 83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8 not found: ID does not exist" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.438828 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:30 crc kubenswrapper[4778]: E0318 09:25:30.439280 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-log" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.439293 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-log" Mar 18 09:25:30 crc kubenswrapper[4778]: E0318 09:25:30.439313 4778 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-metadata" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.439319 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-metadata" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.439491 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-metadata" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.439501 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-log" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.440519 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.453734 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.453957 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.455965 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:30 crc kubenswrapper[4778]: W0318 09:25:30.482027 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1663e1b0_f9b0_4168_9386_abf2c1b56b43.slice/crio-4161803091eb245c5d7ade5e9e9721aecc3f9745f54b9772f1e7645c5b3b9e05 WatchSource:0}: Error finding container 4161803091eb245c5d7ade5e9e9721aecc3f9745f54b9772f1e7645c5b3b9e05: Status 404 returned error can't find the container with id 4161803091eb245c5d7ade5e9e9721aecc3f9745f54b9772f1e7645c5b3b9e05 Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 
09:25:30.492372 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.544008 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.544094 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.544222 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426016e7-8d14-4511-b963-528b9f54a8d1-logs\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.544313 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-config-data\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.544340 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54xfn\" (UniqueName: \"kubernetes.io/projected/426016e7-8d14-4511-b963-528b9f54a8d1-kube-api-access-54xfn\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " 
pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.645665 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426016e7-8d14-4511-b963-528b9f54a8d1-logs\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.645769 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-config-data\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.645800 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54xfn\" (UniqueName: \"kubernetes.io/projected/426016e7-8d14-4511-b963-528b9f54a8d1-kube-api-access-54xfn\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.645841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.645869 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.646386 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/426016e7-8d14-4511-b963-528b9f54a8d1-logs\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.652953 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-config-data\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.653877 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.654636 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.673401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54xfn\" (UniqueName: \"kubernetes.io/projected/426016e7-8d14-4511-b963-528b9f54a8d1-kube-api-access-54xfn\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.818315 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.367705 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.380293 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerID="c9b5655db1df35344b5ba27e4f2b48e2b1e3b0ed8aae0f66ceb4cd01dacb6163" exitCode=0 Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.380389 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerDied","Data":"c9b5655db1df35344b5ba27e4f2b48e2b1e3b0ed8aae0f66ceb4cd01dacb6163"} Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.381556 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"426016e7-8d14-4511-b963-528b9f54a8d1","Type":"ContainerStarted","Data":"d558c548a049774135a4b2809d2aa33c1a8e339abe7d548e3631a7e27ee2c8c4"} Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.383016 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1663e1b0-f9b0-4168-9386-abf2c1b56b43","Type":"ContainerStarted","Data":"083b98fb08e327ebd6216ea2b2ce175fbeb359e00c495b5782bf6e305325cab7"} Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.383063 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1663e1b0-f9b0-4168-9386-abf2c1b56b43","Type":"ContainerStarted","Data":"4161803091eb245c5d7ade5e9e9721aecc3f9745f54b9772f1e7645c5b3b9e05"} Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.383345 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.417133 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/kube-state-metrics-0" podStartSLOduration=2.0401437590000002 podStartE2EDuration="2.417105327s" podCreationTimestamp="2026-03-18 09:25:29 +0000 UTC" firstStartedPulling="2026-03-18 09:25:30.487327361 +0000 UTC m=+1397.062072201" lastFinishedPulling="2026-03-18 09:25:30.864288909 +0000 UTC m=+1397.439033769" observedRunningTime="2026-03-18 09:25:31.403124796 +0000 UTC m=+1397.977869636" watchObservedRunningTime="2026-03-18 09:25:31.417105327 +0000 UTC m=+1397.991850167" Mar 18 09:25:32 crc kubenswrapper[4778]: I0318 09:25:32.200915 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" path="/var/lib/kubelet/pods/8dfe71cf-9dde-4056-a41b-a36c1773ace5/volumes" Mar 18 09:25:32 crc kubenswrapper[4778]: I0318 09:25:32.394165 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"426016e7-8d14-4511-b963-528b9f54a8d1","Type":"ContainerStarted","Data":"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090"} Mar 18 09:25:32 crc kubenswrapper[4778]: I0318 09:25:32.394250 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"426016e7-8d14-4511-b963-528b9f54a8d1","Type":"ContainerStarted","Data":"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea"} Mar 18 09:25:32 crc kubenswrapper[4778]: I0318 09:25:32.397637 4778 generic.go:334] "Generic (PLEG): container finished" podID="b89439e3-a138-4aa8-98a4-2e23ce3819e0" containerID="462bde6149a0c02e0a81e9d8cf7097470bbd3546789cdc6d2d61c3177437187e" exitCode=0 Mar 18 09:25:32 crc kubenswrapper[4778]: I0318 09:25:32.398121 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mjs29" event={"ID":"b89439e3-a138-4aa8-98a4-2e23ce3819e0","Type":"ContainerDied","Data":"462bde6149a0c02e0a81e9d8cf7097470bbd3546789cdc6d2d61c3177437187e"} Mar 18 09:25:32 crc kubenswrapper[4778]: I0318 09:25:32.455716 4778 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.455681008 podStartE2EDuration="2.455681008s" podCreationTimestamp="2026-03-18 09:25:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:32.440073642 +0000 UTC m=+1399.014818512" watchObservedRunningTime="2026-03-18 09:25:32.455681008 +0000 UTC m=+1399.030425858" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.416558 4778 generic.go:334] "Generic (PLEG): container finished" podID="e85d64a6-99af-4b66-9a60-cd6a046af840" containerID="973e9f8f665d67a226625f9e044e9c18b31cbfecdc6ee8dcf02562081d63ced4" exitCode=0 Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.416677 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" event={"ID":"e85d64a6-99af-4b66-9a60-cd6a046af840","Type":"ContainerDied","Data":"973e9f8f665d67a226625f9e044e9c18b31cbfecdc6ee8dcf02562081d63ced4"} Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.803747 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.820895 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrl8g\" (UniqueName: \"kubernetes.io/projected/b89439e3-a138-4aa8-98a4-2e23ce3819e0-kube-api-access-wrl8g\") pod \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.820953 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-combined-ca-bundle\") pod \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.821051 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-config-data\") pod \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.821083 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-scripts\") pod \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.833057 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-scripts" (OuterVolumeSpecName: "scripts") pod "b89439e3-a138-4aa8-98a4-2e23ce3819e0" (UID: "b89439e3-a138-4aa8-98a4-2e23ce3819e0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.833496 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b89439e3-a138-4aa8-98a4-2e23ce3819e0-kube-api-access-wrl8g" (OuterVolumeSpecName: "kube-api-access-wrl8g") pod "b89439e3-a138-4aa8-98a4-2e23ce3819e0" (UID: "b89439e3-a138-4aa8-98a4-2e23ce3819e0"). InnerVolumeSpecName "kube-api-access-wrl8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.879010 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b89439e3-a138-4aa8-98a4-2e23ce3819e0" (UID: "b89439e3-a138-4aa8-98a4-2e23ce3819e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.879652 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-config-data" (OuterVolumeSpecName: "config-data") pod "b89439e3-a138-4aa8-98a4-2e23ce3819e0" (UID: "b89439e3-a138-4aa8-98a4-2e23ce3819e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.891892 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.892007 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.925920 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.925995 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.926030 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrl8g\" (UniqueName: \"kubernetes.io/projected/b89439e3-a138-4aa8-98a4-2e23ce3819e0-kube-api-access-wrl8g\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.926060 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.152760 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.201091 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.201144 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:34 crc 
kubenswrapper[4778]: I0318 09:25:34.205758 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.343097 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-9xln9"] Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.343403 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerName="dnsmasq-dns" containerID="cri-o://623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42" gracePeriod=10 Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.430533 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mjs29" event={"ID":"b89439e3-a138-4aa8-98a4-2e23ce3819e0","Type":"ContainerDied","Data":"829ccaa44c975124734c6e174a8e09a9966fa6e5717d8a5911b2a376e783ecf2"} Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.430633 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.431003 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="829ccaa44c975124734c6e174a8e09a9966fa6e5717d8a5911b2a376e783ecf2" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.520141 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.652854 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.677554 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.677830 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-log" containerID="cri-o://7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea" gracePeriod=30 Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.678351 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-metadata" containerID="cri-o://29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090" gracePeriod=30 Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.974493 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.975022 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.129614 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.154260 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.176946 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360025 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-config-data\") pod \"e85d64a6-99af-4b66-9a60-cd6a046af840\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360109 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-combined-ca-bundle\") pod \"e85d64a6-99af-4b66-9a60-cd6a046af840\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360255 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-dns-svc\") pod \"06a6d934-2f47-4628-a328-6ba9cefb8090\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360294 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-sb\") pod \"06a6d934-2f47-4628-a328-6ba9cefb8090\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-nb\") pod \"06a6d934-2f47-4628-a328-6ba9cefb8090\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360468 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-scripts\") pod \"e85d64a6-99af-4b66-9a60-cd6a046af840\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360622 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-config\") pod \"06a6d934-2f47-4628-a328-6ba9cefb8090\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360743 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hck6z\" (UniqueName: \"kubernetes.io/projected/06a6d934-2f47-4628-a328-6ba9cefb8090-kube-api-access-hck6z\") pod \"06a6d934-2f47-4628-a328-6ba9cefb8090\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360775 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldpq2\" (UniqueName: \"kubernetes.io/projected/e85d64a6-99af-4b66-9a60-cd6a046af840-kube-api-access-ldpq2\") pod \"e85d64a6-99af-4b66-9a60-cd6a046af840\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 
09:25:35.396499 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-scripts" (OuterVolumeSpecName: "scripts") pod "e85d64a6-99af-4b66-9a60-cd6a046af840" (UID: "e85d64a6-99af-4b66-9a60-cd6a046af840"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.396534 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a6d934-2f47-4628-a328-6ba9cefb8090-kube-api-access-hck6z" (OuterVolumeSpecName: "kube-api-access-hck6z") pod "06a6d934-2f47-4628-a328-6ba9cefb8090" (UID: "06a6d934-2f47-4628-a328-6ba9cefb8090"). InnerVolumeSpecName "kube-api-access-hck6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.396607 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85d64a6-99af-4b66-9a60-cd6a046af840-kube-api-access-ldpq2" (OuterVolumeSpecName: "kube-api-access-ldpq2") pod "e85d64a6-99af-4b66-9a60-cd6a046af840" (UID: "e85d64a6-99af-4b66-9a60-cd6a046af840"). InnerVolumeSpecName "kube-api-access-ldpq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.416068 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-config-data" (OuterVolumeSpecName: "config-data") pod "e85d64a6-99af-4b66-9a60-cd6a046af840" (UID: "e85d64a6-99af-4b66-9a60-cd6a046af840"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.419559 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.432345 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e85d64a6-99af-4b66-9a60-cd6a046af840" (UID: "e85d64a6-99af-4b66-9a60-cd6a046af840"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.466183 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06a6d934-2f47-4628-a328-6ba9cefb8090" (UID: "06a6d934-2f47-4628-a328-6ba9cefb8090"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.467459 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.467493 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hck6z\" (UniqueName: \"kubernetes.io/projected/06a6d934-2f47-4628-a328-6ba9cefb8090-kube-api-access-hck6z\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.467506 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldpq2\" (UniqueName: \"kubernetes.io/projected/e85d64a6-99af-4b66-9a60-cd6a046af840-kube-api-access-ldpq2\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.467516 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-config-data\") on node \"crc\" DevicePath \"\"" 
Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.467525 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.467533 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.472894 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "06a6d934-2f47-4628-a328-6ba9cefb8090" (UID: "06a6d934-2f47-4628-a328-6ba9cefb8090"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.474489 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerID="5bf367bc7ca1760e31682d105fada5e12302e00c9e7370ed60b5dbbd5a0d181b" exitCode=0 Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.474570 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerDied","Data":"5bf367bc7ca1760e31682d105fada5e12302e00c9e7370ed60b5dbbd5a0d181b"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477399 4778 generic.go:334] "Generic (PLEG): container finished" podID="426016e7-8d14-4511-b963-528b9f54a8d1" containerID="29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090" exitCode=0 Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477422 4778 generic.go:334] "Generic (PLEG): container finished" podID="426016e7-8d14-4511-b963-528b9f54a8d1" 
containerID="7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea" exitCode=143 Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477464 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"426016e7-8d14-4511-b963-528b9f54a8d1","Type":"ContainerDied","Data":"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477482 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"426016e7-8d14-4511-b963-528b9f54a8d1","Type":"ContainerDied","Data":"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477495 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"426016e7-8d14-4511-b963-528b9f54a8d1","Type":"ContainerDied","Data":"d558c548a049774135a4b2809d2aa33c1a8e339abe7d548e3631a7e27ee2c8c4"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477516 4778 scope.go:117] "RemoveContainer" containerID="29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477670 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.479707 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-config" (OuterVolumeSpecName: "config") pod "06a6d934-2f47-4628-a328-6ba9cefb8090" (UID: "06a6d934-2f47-4628-a328-6ba9cefb8090"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.483639 4778 generic.go:334] "Generic (PLEG): container finished" podID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerID="623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42" exitCode=0 Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.483715 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" event={"ID":"06a6d934-2f47-4628-a328-6ba9cefb8090","Type":"ContainerDied","Data":"623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.483811 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" event={"ID":"06a6d934-2f47-4628-a328-6ba9cefb8090","Type":"ContainerDied","Data":"ff4d7228da7d0afd7376861dde18209e6f8e60848f76f48755825c7cbc2d2227"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.483920 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.486352 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-log" containerID="cri-o://3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1" gracePeriod=30 Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.486718 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.495119 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" event={"ID":"e85d64a6-99af-4b66-9a60-cd6a046af840","Type":"ContainerDied","Data":"9118c3ac4661b3103a217848a93eccdeeeb28befd44dfdb72aa05904c8bd2a92"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.495144 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9118c3ac4661b3103a217848a93eccdeeeb28befd44dfdb72aa05904c8bd2a92" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.495397 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-api" containerID="cri-o://db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d" gracePeriod=30 Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.507626 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "06a6d934-2f47-4628-a328-6ba9cefb8090" (UID: "06a6d934-2f47-4628-a328-6ba9cefb8090"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.533219 4778 scope.go:117] "RemoveContainer" containerID="7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564336 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.564764 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85d64a6-99af-4b66-9a60-cd6a046af840" containerName="nova-cell1-conductor-db-sync" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564778 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85d64a6-99af-4b66-9a60-cd6a046af840" containerName="nova-cell1-conductor-db-sync" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.564799 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-metadata" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564806 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-metadata" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.564822 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89439e3-a138-4aa8-98a4-2e23ce3819e0" containerName="nova-manage" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564828 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89439e3-a138-4aa8-98a4-2e23ce3819e0" containerName="nova-manage" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.564840 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerName="init" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564845 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerName="init" Mar 18 
09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.564859 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerName="dnsmasq-dns" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564865 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerName="dnsmasq-dns" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.564873 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-log" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564879 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-log" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565061 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-log" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565072 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-metadata" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565079 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85d64a6-99af-4b66-9a60-cd6a046af840" containerName="nova-cell1-conductor-db-sync" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565093 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerName="dnsmasq-dns" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565104 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89439e3-a138-4aa8-98a4-2e23ce3819e0" containerName="nova-manage" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565700 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565812 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.568919 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-config-data\") pod \"426016e7-8d14-4511-b963-528b9f54a8d1\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569012 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426016e7-8d14-4511-b963-528b9f54a8d1-logs\") pod \"426016e7-8d14-4511-b963-528b9f54a8d1\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569111 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-combined-ca-bundle\") pod \"426016e7-8d14-4511-b963-528b9f54a8d1\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569138 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569188 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-nova-metadata-tls-certs\") pod \"426016e7-8d14-4511-b963-528b9f54a8d1\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569298 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54xfn\" (UniqueName: 
\"kubernetes.io/projected/426016e7-8d14-4511-b963-528b9f54a8d1-kube-api-access-54xfn\") pod \"426016e7-8d14-4511-b963-528b9f54a8d1\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569720 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569738 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569748 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569714 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426016e7-8d14-4511-b963-528b9f54a8d1-logs" (OuterVolumeSpecName: "logs") pod "426016e7-8d14-4511-b963-528b9f54a8d1" (UID: "426016e7-8d14-4511-b963-528b9f54a8d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.582088 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/426016e7-8d14-4511-b963-528b9f54a8d1-kube-api-access-54xfn" (OuterVolumeSpecName: "kube-api-access-54xfn") pod "426016e7-8d14-4511-b963-528b9f54a8d1" (UID: "426016e7-8d14-4511-b963-528b9f54a8d1"). InnerVolumeSpecName "kube-api-access-54xfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.606389 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-config-data" (OuterVolumeSpecName: "config-data") pod "426016e7-8d14-4511-b963-528b9f54a8d1" (UID: "426016e7-8d14-4511-b963-528b9f54a8d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.626598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "426016e7-8d14-4511-b963-528b9f54a8d1" (UID: "426016e7-8d14-4511-b963-528b9f54a8d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.637966 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "426016e7-8d14-4511-b963-528b9f54a8d1" (UID: "426016e7-8d14-4511-b963-528b9f54a8d1"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670719 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdgh6\" (UniqueName: \"kubernetes.io/projected/9ba2a389-4009-4dab-bc75-45a574e50bbc-kube-api-access-bdgh6\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670780 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba2a389-4009-4dab-bc75-45a574e50bbc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670887 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba2a389-4009-4dab-bc75-45a574e50bbc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670949 4778 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670967 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54xfn\" (UniqueName: \"kubernetes.io/projected/426016e7-8d14-4511-b963-528b9f54a8d1-kube-api-access-54xfn\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670977 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670986 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426016e7-8d14-4511-b963-528b9f54a8d1-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670995 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.671740 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.678904 4778 scope.go:117] "RemoveContainer" containerID="29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.679496 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090\": container with ID starting with 29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090 not found: ID does not exist" containerID="29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.679568 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090"} err="failed to get container status \"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090\": rpc error: code = NotFound desc = could not find container \"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090\": container with ID starting with 
29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090 not found: ID does not exist" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.679608 4778 scope.go:117] "RemoveContainer" containerID="7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.679994 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea\": container with ID starting with 7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea not found: ID does not exist" containerID="7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.680043 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea"} err="failed to get container status \"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea\": rpc error: code = NotFound desc = could not find container \"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea\": container with ID starting with 7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea not found: ID does not exist" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.680074 4778 scope.go:117] "RemoveContainer" containerID="29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.681891 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090"} err="failed to get container status \"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090\": rpc error: code = NotFound desc = could not find container \"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090\": container with ID 
starting with 29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090 not found: ID does not exist" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.681918 4778 scope.go:117] "RemoveContainer" containerID="7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.682228 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea"} err="failed to get container status \"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea\": rpc error: code = NotFound desc = could not find container \"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea\": container with ID starting with 7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea not found: ID does not exist" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.682257 4778 scope.go:117] "RemoveContainer" containerID="623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.733075 4778 scope.go:117] "RemoveContainer" containerID="9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.773663 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-combined-ca-bundle\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.773761 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-sg-core-conf-yaml\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: 
I0318 09:25:35.773784 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-log-httpd\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.773854 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-run-httpd\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.773879 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92rb8\" (UniqueName: \"kubernetes.io/projected/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-kube-api-access-92rb8\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.773968 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-scripts\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.774011 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-config-data\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.774337 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba2a389-4009-4dab-bc75-45a574e50bbc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.774448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdgh6\" (UniqueName: \"kubernetes.io/projected/9ba2a389-4009-4dab-bc75-45a574e50bbc-kube-api-access-bdgh6\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.774494 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba2a389-4009-4dab-bc75-45a574e50bbc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.775529 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.775592 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.779832 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-kube-api-access-92rb8" (OuterVolumeSpecName: "kube-api-access-92rb8") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "kube-api-access-92rb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.779976 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-scripts" (OuterVolumeSpecName: "scripts") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.782584 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba2a389-4009-4dab-bc75-45a574e50bbc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.790969 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba2a389-4009-4dab-bc75-45a574e50bbc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.800244 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdgh6\" (UniqueName: \"kubernetes.io/projected/9ba2a389-4009-4dab-bc75-45a574e50bbc-kube-api-access-bdgh6\") pod \"nova-cell1-conductor-0\" (UID: 
\"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.808715 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.870409 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.877723 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.877747 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.877783 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.877793 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.877802 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92rb8\" (UniqueName: \"kubernetes.io/projected/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-kube-api-access-92rb8\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.877812 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.924292 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-config-data" (OuterVolumeSpecName: "config-data") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.979888 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.984997 4778 scope.go:117] "RemoveContainer" containerID="623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.985423 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42\": container with ID starting with 623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42 not found: ID does not exist" containerID="623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.985491 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42"} err="failed to get container status \"623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42\": rpc error: code = NotFound desc = could not find container \"623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42\": container with ID starting with 623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42 not found: ID does not exist" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.985549 4778 scope.go:117] "RemoveContainer" containerID="9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.985876 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6\": container 
with ID starting with 9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6 not found: ID does not exist" containerID="9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.985905 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6"} err="failed to get container status \"9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6\": rpc error: code = NotFound desc = could not find container \"9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6\": container with ID starting with 9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6 not found: ID does not exist" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.990824 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.014912 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.055349 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.079834 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: E0318 09:25:36.080390 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="proxy-httpd" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080411 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="proxy-httpd" Mar 18 09:25:36 crc kubenswrapper[4778]: E0318 09:25:36.080432 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" 
containerName="ceilometer-notification-agent" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080441 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-notification-agent" Mar 18 09:25:36 crc kubenswrapper[4778]: E0318 09:25:36.080465 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-central-agent" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080476 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-central-agent" Mar 18 09:25:36 crc kubenswrapper[4778]: E0318 09:25:36.080488 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="sg-core" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080494 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="sg-core" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080690 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="proxy-httpd" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080709 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-notification-agent" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080715 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-central-agent" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080731 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="sg-core" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.082426 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.087926 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.089166 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.094267 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-9xln9"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.113674 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-9xln9"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.123876 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.187328 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ljtb\" (UniqueName: \"kubernetes.io/projected/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-kube-api-access-7ljtb\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.187366 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-config-data\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.187425 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.187450 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-logs\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.187494 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.202967 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" path="/var/lib/kubelet/pods/06a6d934-2f47-4628-a328-6ba9cefb8090/volumes" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.203720 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" path="/var/lib/kubelet/pods/426016e7-8d14-4511-b963-528b9f54a8d1/volumes" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.289510 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-config-data\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.289704 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.289761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-logs\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.289870 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.290012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ljtb\" (UniqueName: \"kubernetes.io/projected/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-kube-api-access-7ljtb\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.293664 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-logs\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.303423 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.304491 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-config-data\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.306616 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.314346 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ljtb\" (UniqueName: \"kubernetes.io/projected/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-kube-api-access-7ljtb\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.403746 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.504014 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.504509 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerDied","Data":"f02f2bc76544bcc97a0d67474e80bad569ca14d09e55a75f25e0c054c4e8be41"} Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.504553 4778 scope.go:117] "RemoveContainer" containerID="7e71673254bb31ec9a40bba9a74a5d569998bf03658635f51dde9500e1435648" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.541657 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.566601 4778 generic.go:334] "Generic (PLEG): container finished" podID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerID="3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1" exitCode=143 Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.566910 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="326b0319-1314-4e0c-9e38-7f0358087107" containerName="nova-scheduler-scheduler" containerID="cri-o://4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" gracePeriod=30 Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.567377 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"827e3d5b-c1fe-4634-b819-4d816911b71e","Type":"ContainerDied","Data":"3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1"} Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.594421 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.596951 4778 scope.go:117] "RemoveContainer" containerID="5aea936acc296e6d46f72dc7943fff348d82ba9d5c552a78ef0902e78f275280" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.649281 4778 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.657760 4778 scope.go:117] "RemoveContainer" containerID="5bf367bc7ca1760e31682d105fada5e12302e00c9e7370ed60b5dbbd5a0d181b" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.663884 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.667275 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.669883 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.670967 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.671572 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.680317 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.702893 4778 scope.go:117] "RemoveContainer" containerID="c9b5655db1df35344b5ba27e4f2b48e2b1e3b0ed8aae0f66ceb4cd01dacb6163" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.747973 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748030 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml984\" (UniqueName: 
\"kubernetes.io/projected/f997c05f-82b3-4d82-859d-b02f458e355d-kube-api-access-ml984\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748055 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748137 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-config-data\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748156 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748243 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-log-httpd\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748312 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-scripts\") pod \"ceilometer-0\" (UID: 
\"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748333 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-run-httpd\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.849540 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-log-httpd\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850253 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-scripts\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850282 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-run-httpd\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850321 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850358 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ml984\" (UniqueName: \"kubernetes.io/projected/f997c05f-82b3-4d82-859d-b02f458e355d-kube-api-access-ml984\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850383 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850427 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-config-data\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850853 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.852735 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-run-httpd\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.852826 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-log-httpd\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 
crc kubenswrapper[4778]: I0318 09:25:36.858657 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.858669 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-scripts\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.861943 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.866523 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-config-data\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.875955 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml984\" (UniqueName: \"kubernetes.io/projected/f997c05f-82b3-4d82-859d-b02f458e355d-kube-api-access-ml984\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.877900 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.918748 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.002307 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.530832 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:25:37 crc kubenswrapper[4778]: W0318 09:25:37.531870 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf997c05f_82b3_4d82_859d_b02f458e355d.slice/crio-24881499c24b431bd6cb35e91dd94903665892ef9cb2374a0f2fab97787e21e9 WatchSource:0}: Error finding container 24881499c24b431bd6cb35e91dd94903665892ef9cb2374a0f2fab97787e21e9: Status 404 returned error can't find the container with id 24881499c24b431bd6cb35e91dd94903665892ef9cb2374a0f2fab97787e21e9 Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.578283 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerStarted","Data":"24881499c24b431bd6cb35e91dd94903665892ef9cb2374a0f2fab97787e21e9"} Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.579566 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9ba2a389-4009-4dab-bc75-45a574e50bbc","Type":"ContainerStarted","Data":"5cf0eb007f476e62593e4a0a2cc9a2ff0d13c9d31495cd46cfc6ebd1df79f55e"} Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.579606 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"9ba2a389-4009-4dab-bc75-45a574e50bbc","Type":"ContainerStarted","Data":"da1dc5fd19dc3a3a37b90396161aa1a4b3b7f5930176884f084e9573161a7b03"} Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.580778 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.582395 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e","Type":"ContainerStarted","Data":"b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1"} Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.582437 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e","Type":"ContainerStarted","Data":"1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2"} Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.582451 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e","Type":"ContainerStarted","Data":"290b792dbc94b49540da6dec52821c0018cd2340a491c925285465d74334b24e"} Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.630466 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.630445654 podStartE2EDuration="2.630445654s" podCreationTimestamp="2026-03-18 09:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:37.605789932 +0000 UTC m=+1404.180534782" watchObservedRunningTime="2026-03-18 09:25:37.630445654 +0000 UTC m=+1404.205190494" Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.636057 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.636040036 
podStartE2EDuration="2.636040036s" podCreationTimestamp="2026-03-18 09:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:37.63106885 +0000 UTC m=+1404.205813710" watchObservedRunningTime="2026-03-18 09:25:37.636040036 +0000 UTC m=+1404.210784866" Mar 18 09:25:38 crc kubenswrapper[4778]: I0318 09:25:38.205785 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" path="/var/lib/kubelet/pods/d1c91e10-5caf-4f06-89bb-c9dacc92ecef/volumes" Mar 18 09:25:38 crc kubenswrapper[4778]: I0318 09:25:38.611906 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerStarted","Data":"247ae67d75fa48d0cb6104fd6017725a72fc4e82ff55213dd5e0b293d531d756"} Mar 18 09:25:39 crc kubenswrapper[4778]: E0318 09:25:39.154555 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 09:25:39 crc kubenswrapper[4778]: E0318 09:25:39.159572 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 09:25:39 crc kubenswrapper[4778]: E0318 09:25:39.161306 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 09:25:39 crc kubenswrapper[4778]: E0318 09:25:39.161365 4778 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="326b0319-1314-4e0c-9e38-7f0358087107" containerName="nova-scheduler-scheduler" Mar 18 09:25:39 crc kubenswrapper[4778]: I0318 09:25:39.635767 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerStarted","Data":"dcd444b95c81fbfdea64f1fbc2bd6aead66b1e18e8638323c659ba0124ee650c"} Mar 18 09:25:39 crc kubenswrapper[4778]: I0318 09:25:39.852747 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 09:25:39 crc kubenswrapper[4778]: I0318 09:25:39.999601 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.026232 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-config-data\") pod \"326b0319-1314-4e0c-9e38-7f0358087107\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.027012 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sc9l\" (UniqueName: \"kubernetes.io/projected/326b0319-1314-4e0c-9e38-7f0358087107-kube-api-access-7sc9l\") pod \"326b0319-1314-4e0c-9e38-7f0358087107\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.027082 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-combined-ca-bundle\") pod \"326b0319-1314-4e0c-9e38-7f0358087107\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.033675 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/326b0319-1314-4e0c-9e38-7f0358087107-kube-api-access-7sc9l" (OuterVolumeSpecName: "kube-api-access-7sc9l") pod "326b0319-1314-4e0c-9e38-7f0358087107" (UID: "326b0319-1314-4e0c-9e38-7f0358087107"). InnerVolumeSpecName "kube-api-access-7sc9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.055362 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-config-data" (OuterVolumeSpecName: "config-data") pod "326b0319-1314-4e0c-9e38-7f0358087107" (UID: "326b0319-1314-4e0c-9e38-7f0358087107"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.074050 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "326b0319-1314-4e0c-9e38-7f0358087107" (UID: "326b0319-1314-4e0c-9e38-7f0358087107"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.128874 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.128906 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.128932 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sc9l\" (UniqueName: \"kubernetes.io/projected/326b0319-1314-4e0c-9e38-7f0358087107-kube-api-access-7sc9l\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.644864 4778 generic.go:334] "Generic (PLEG): container finished" podID="326b0319-1314-4e0c-9e38-7f0358087107" containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" exitCode=0 Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.644895 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.644901 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"326b0319-1314-4e0c-9e38-7f0358087107","Type":"ContainerDied","Data":"4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc"} Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.644937 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"326b0319-1314-4e0c-9e38-7f0358087107","Type":"ContainerDied","Data":"c61d899fafa112db54c6c9a253fee0265e8eb8a1263e04ecef015a4a97256ebe"} Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.644956 4778 scope.go:117] "RemoveContainer" containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.650191 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerStarted","Data":"a8ed834ba7a0f4ec475460749067a718c91fbf1875be425bac23470f710feed4"} Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.668481 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.670101 4778 scope.go:117] "RemoveContainer" containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" Mar 18 09:25:40 crc kubenswrapper[4778]: E0318 09:25:40.670699 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc\": container with ID starting with 4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc not found: ID does not exist" containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.670764 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc"} err="failed to get container status \"4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc\": rpc error: code = NotFound desc = could not find container \"4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc\": container with ID starting with 4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc not found: ID does not exist" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.693686 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.697649 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:40 crc kubenswrapper[4778]: E0318 09:25:40.698000 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="326b0319-1314-4e0c-9e38-7f0358087107" containerName="nova-scheduler-scheduler" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.698017 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="326b0319-1314-4e0c-9e38-7f0358087107" containerName="nova-scheduler-scheduler" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.698215 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="326b0319-1314-4e0c-9e38-7f0358087107" containerName="nova-scheduler-scheduler" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.698719 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.701115 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.706430 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.737121 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-config-data\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.737669 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.737702 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl9dz\" (UniqueName: \"kubernetes.io/projected/dd2218f5-0310-4e4c-8edc-d13c25707ea5-kube-api-access-vl9dz\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.839811 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.839877 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl9dz\" (UniqueName: \"kubernetes.io/projected/dd2218f5-0310-4e4c-8edc-d13c25707ea5-kube-api-access-vl9dz\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.839950 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-config-data\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.850704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-config-data\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.850992 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.858613 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl9dz\" (UniqueName: \"kubernetes.io/projected/dd2218f5-0310-4e4c-8edc-d13c25707ea5-kube-api-access-vl9dz\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.035434 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.413271 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.448673 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/827e3d5b-c1fe-4634-b819-4d816911b71e-logs\") pod \"827e3d5b-c1fe-4634-b819-4d816911b71e\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.448811 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-combined-ca-bundle\") pod \"827e3d5b-c1fe-4634-b819-4d816911b71e\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.448868 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rlzt\" (UniqueName: \"kubernetes.io/projected/827e3d5b-c1fe-4634-b819-4d816911b71e-kube-api-access-7rlzt\") pod \"827e3d5b-c1fe-4634-b819-4d816911b71e\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.448985 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-config-data\") pod \"827e3d5b-c1fe-4634-b819-4d816911b71e\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.451865 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/827e3d5b-c1fe-4634-b819-4d816911b71e-logs" (OuterVolumeSpecName: "logs") pod "827e3d5b-c1fe-4634-b819-4d816911b71e" (UID: "827e3d5b-c1fe-4634-b819-4d816911b71e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.455550 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/827e3d5b-c1fe-4634-b819-4d816911b71e-kube-api-access-7rlzt" (OuterVolumeSpecName: "kube-api-access-7rlzt") pod "827e3d5b-c1fe-4634-b819-4d816911b71e" (UID: "827e3d5b-c1fe-4634-b819-4d816911b71e"). InnerVolumeSpecName "kube-api-access-7rlzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.484855 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-config-data" (OuterVolumeSpecName: "config-data") pod "827e3d5b-c1fe-4634-b819-4d816911b71e" (UID: "827e3d5b-c1fe-4634-b819-4d816911b71e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.486420 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "827e3d5b-c1fe-4634-b819-4d816911b71e" (UID: "827e3d5b-c1fe-4634-b819-4d816911b71e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.551040 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.551557 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/827e3d5b-c1fe-4634-b819-4d816911b71e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.551571 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.551585 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rlzt\" (UniqueName: \"kubernetes.io/projected/827e3d5b-c1fe-4634-b819-4d816911b71e-kube-api-access-7rlzt\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:41 crc kubenswrapper[4778]: W0318 09:25:41.567409 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd2218f5_0310_4e4c_8edc_d13c25707ea5.slice/crio-b346fc3122696affb646e267887f25b4c0332b9b1cc46fa78a99f5366632ff21 WatchSource:0}: Error finding container b346fc3122696affb646e267887f25b4c0332b9b1cc46fa78a99f5366632ff21: Status 404 returned error can't find the container with id b346fc3122696affb646e267887f25b4c0332b9b1cc46fa78a99f5366632ff21 Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.568144 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.660752 4778 generic.go:334] "Generic (PLEG): container finished" podID="827e3d5b-c1fe-4634-b819-4d816911b71e" 
containerID="db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d" exitCode=0 Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.660824 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"827e3d5b-c1fe-4634-b819-4d816911b71e","Type":"ContainerDied","Data":"db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d"} Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.660858 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"827e3d5b-c1fe-4634-b819-4d816911b71e","Type":"ContainerDied","Data":"af02077ff8be3a9cb17ac9dfdbd7efd69c18c248340c749dae5c8e266151e304"} Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.660880 4778 scope.go:117] "RemoveContainer" containerID="db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.660984 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.670012 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd2218f5-0310-4e4c-8edc-d13c25707ea5","Type":"ContainerStarted","Data":"b346fc3122696affb646e267887f25b4c0332b9b1cc46fa78a99f5366632ff21"} Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.712062 4778 scope.go:117] "RemoveContainer" containerID="3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.720718 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.737695 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.751223 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:41 crc kubenswrapper[4778]: E0318 
09:25:41.751632 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-log" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.751646 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-log" Mar 18 09:25:41 crc kubenswrapper[4778]: E0318 09:25:41.751666 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-api" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.751672 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-api" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.751865 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-api" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.751888 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-log" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.752848 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.761544 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.762756 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.781593 4778 scope.go:117] "RemoveContainer" containerID="db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d" Mar 18 09:25:41 crc kubenswrapper[4778]: E0318 09:25:41.784636 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d\": container with ID starting with db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d not found: ID does not exist" containerID="db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.784684 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d"} err="failed to get container status \"db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d\": rpc error: code = NotFound desc = could not find container \"db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d\": container with ID starting with db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d not found: ID does not exist" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.784717 4778 scope.go:117] "RemoveContainer" containerID="3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1" Mar 18 09:25:41 crc kubenswrapper[4778]: E0318 09:25:41.786135 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1\": container with ID starting with 3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1 not found: ID does not exist" containerID="3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.786222 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1"} err="failed to get container status \"3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1\": rpc error: code = NotFound desc = could not find container \"3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1\": container with ID starting with 3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1 not found: ID does not exist" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.861145 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-logs\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.861217 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-config-data\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.861348 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc 
kubenswrapper[4778]: I0318 09:25:41.861404 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsn6r\" (UniqueName: \"kubernetes.io/projected/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-kube-api-access-wsn6r\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.963346 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-logs\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.963640 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-config-data\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.963911 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.964020 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsn6r\" (UniqueName: \"kubernetes.io/projected/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-kube-api-access-wsn6r\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.963984 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-logs\") 
pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.967453 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-config-data\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.968419 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.985421 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsn6r\" (UniqueName: \"kubernetes.io/projected/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-kube-api-access-wsn6r\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.076584 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.208615 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="326b0319-1314-4e0c-9e38-7f0358087107" path="/var/lib/kubelet/pods/326b0319-1314-4e0c-9e38-7f0358087107/volumes" Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.209472 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" path="/var/lib/kubelet/pods/827e3d5b-c1fe-4634-b819-4d816911b71e/volumes" Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.558386 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.685370 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerStarted","Data":"53356b73b893522d6b8ce2a575d851c14fe0eb313a03d20cf218a2982a049325"} Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.685529 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.689216 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68","Type":"ContainerStarted","Data":"d464bbdf554e5888aa55da417c9822d9fe3e925deb0939352bfe349e89da555d"} Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.692228 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd2218f5-0310-4e4c-8edc-d13c25707ea5","Type":"ContainerStarted","Data":"cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413"} Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.719106 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.31534946 podStartE2EDuration="6.719081964s" 
podCreationTimestamp="2026-03-18 09:25:36 +0000 UTC" firstStartedPulling="2026-03-18 09:25:37.535803475 +0000 UTC m=+1404.110548315" lastFinishedPulling="2026-03-18 09:25:41.939535979 +0000 UTC m=+1408.514280819" observedRunningTime="2026-03-18 09:25:42.704855146 +0000 UTC m=+1409.279600006" watchObservedRunningTime="2026-03-18 09:25:42.719081964 +0000 UTC m=+1409.293826804" Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.738469 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.7384498710000003 podStartE2EDuration="2.738449871s" podCreationTimestamp="2026-03-18 09:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:42.73067249 +0000 UTC m=+1409.305417350" watchObservedRunningTime="2026-03-18 09:25:42.738449871 +0000 UTC m=+1409.313194711" Mar 18 09:25:43 crc kubenswrapper[4778]: I0318 09:25:43.710590 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68","Type":"ContainerStarted","Data":"416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f"} Mar 18 09:25:43 crc kubenswrapper[4778]: I0318 09:25:43.711540 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68","Type":"ContainerStarted","Data":"552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38"} Mar 18 09:25:43 crc kubenswrapper[4778]: I0318 09:25:43.752610 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.752581135 podStartE2EDuration="2.752581135s" podCreationTimestamp="2026-03-18 09:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:43.737106264 +0000 UTC 
m=+1410.311851134" watchObservedRunningTime="2026-03-18 09:25:43.752581135 +0000 UTC m=+1410.327326025" Mar 18 09:25:46 crc kubenswrapper[4778]: I0318 09:25:46.036346 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 09:25:46 crc kubenswrapper[4778]: I0318 09:25:46.043871 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:46 crc kubenswrapper[4778]: I0318 09:25:46.404404 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 09:25:46 crc kubenswrapper[4778]: I0318 09:25:46.404530 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 09:25:47 crc kubenswrapper[4778]: I0318 09:25:47.425346 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 09:25:47 crc kubenswrapper[4778]: I0318 09:25:47.425404 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 09:25:51 crc kubenswrapper[4778]: I0318 09:25:51.036437 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 09:25:51 crc kubenswrapper[4778]: I0318 09:25:51.076625 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 09:25:51 crc kubenswrapper[4778]: I0318 09:25:51.850774 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 09:25:52 crc kubenswrapper[4778]: I0318 09:25:52.077035 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 09:25:52 crc kubenswrapper[4778]: I0318 09:25:52.077090 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 09:25:53 crc kubenswrapper[4778]: I0318 09:25:53.158413 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 09:25:53 crc kubenswrapper[4778]: I0318 09:25:53.158845 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 09:25:54 crc kubenswrapper[4778]: I0318 09:25:54.404300 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 09:25:54 crc kubenswrapper[4778]: I0318 09:25:54.405133 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 09:25:56 crc kubenswrapper[4778]: I0318 09:25:56.415349 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 09:25:56 crc kubenswrapper[4778]: I0318 09:25:56.417681 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 09:25:56 crc kubenswrapper[4778]: I0318 09:25:56.426632 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 09:25:56 crc kubenswrapper[4778]: 
I0318 09:25:56.427895 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.850618 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.901837 4778 generic.go:334] "Generic (PLEG): container finished" podID="81cc74f1-64bc-448f-9654-352927efbb4c" containerID="e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5" exitCode=137 Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.901875 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.901877 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81cc74f1-64bc-448f-9654-352927efbb4c","Type":"ContainerDied","Data":"e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5"} Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.901986 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81cc74f1-64bc-448f-9654-352927efbb4c","Type":"ContainerDied","Data":"faecd33981e31af7bf2e1b8e7eefd8b08f1e463cade44ffd3aabfca3aa9b5d91"} Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.902006 4778 scope.go:117] "RemoveContainer" containerID="e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5" Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.935850 4778 scope.go:117] "RemoveContainer" containerID="e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5" Mar 18 09:25:59 crc kubenswrapper[4778]: E0318 09:25:59.936413 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5\": container 
with ID starting with e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5 not found: ID does not exist" containerID="e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5" Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.936457 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5"} err="failed to get container status \"e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5\": rpc error: code = NotFound desc = could not find container \"e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5\": container with ID starting with e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5 not found: ID does not exist" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.044919 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bdst\" (UniqueName: \"kubernetes.io/projected/81cc74f1-64bc-448f-9654-352927efbb4c-kube-api-access-8bdst\") pod \"81cc74f1-64bc-448f-9654-352927efbb4c\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.044966 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-config-data\") pod \"81cc74f1-64bc-448f-9654-352927efbb4c\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.045022 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-combined-ca-bundle\") pod \"81cc74f1-64bc-448f-9654-352927efbb4c\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.062556 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/81cc74f1-64bc-448f-9654-352927efbb4c-kube-api-access-8bdst" (OuterVolumeSpecName: "kube-api-access-8bdst") pod "81cc74f1-64bc-448f-9654-352927efbb4c" (UID: "81cc74f1-64bc-448f-9654-352927efbb4c"). InnerVolumeSpecName "kube-api-access-8bdst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.077089 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.077161 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.083137 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-config-data" (OuterVolumeSpecName: "config-data") pod "81cc74f1-64bc-448f-9654-352927efbb4c" (UID: "81cc74f1-64bc-448f-9654-352927efbb4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.085490 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81cc74f1-64bc-448f-9654-352927efbb4c" (UID: "81cc74f1-64bc-448f-9654-352927efbb4c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.147782 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.147831 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.148296 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bdst\" (UniqueName: \"kubernetes.io/projected/81cc74f1-64bc-448f-9654-352927efbb4c-kube-api-access-8bdst\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.148328 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.148338 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.148982 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563766-lqhxm"] Mar 18 09:26:00 crc kubenswrapper[4778]: E0318 09:26:00.149498 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81cc74f1-64bc-448f-9654-352927efbb4c" 
containerName="nova-cell1-novncproxy-novncproxy" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.149521 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="81cc74f1-64bc-448f-9654-352927efbb4c" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.149791 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="81cc74f1-64bc-448f-9654-352927efbb4c" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.150576 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563766-lqhxm" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.153079 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.153182 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.153590 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.161849 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563766-lqhxm"] Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.247561 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.253368 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6djlr\" (UniqueName: \"kubernetes.io/projected/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa-kube-api-access-6djlr\") pod \"auto-csr-approver-29563766-lqhxm\" (UID: \"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa\") " pod="openshift-infra/auto-csr-approver-29563766-lqhxm" Mar 18 09:26:00 crc 
kubenswrapper[4778]: I0318 09:26:00.264366 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.302072 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.305218 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.308949 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.309145 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.309323 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.315901 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.355785 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6djlr\" (UniqueName: \"kubernetes.io/projected/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa-kube-api-access-6djlr\") pod \"auto-csr-approver-29563766-lqhxm\" (UID: \"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa\") " pod="openshift-infra/auto-csr-approver-29563766-lqhxm" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.378332 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6djlr\" (UniqueName: \"kubernetes.io/projected/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa-kube-api-access-6djlr\") pod \"auto-csr-approver-29563766-lqhxm\" (UID: \"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa\") " pod="openshift-infra/auto-csr-approver-29563766-lqhxm" 
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.457620 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.457935 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4k4t\" (UniqueName: \"kubernetes.io/projected/9549b39b-0fc5-4e89-b64a-de83c80735ed-kube-api-access-s4k4t\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.458022 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.458567 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.458616 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.471088 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563766-lqhxm" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.560220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.560282 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.560369 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.560394 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4k4t\" (UniqueName: \"kubernetes.io/projected/9549b39b-0fc5-4e89-b64a-de83c80735ed-kube-api-access-s4k4t\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.560467 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.567872 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.568937 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.595372 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.595401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.605458 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4k4t\" (UniqueName: \"kubernetes.io/projected/9549b39b-0fc5-4e89-b64a-de83c80735ed-kube-api-access-s4k4t\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.629712 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.783966 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563766-lqhxm"] Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.911034 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563766-lqhxm" event={"ID":"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa","Type":"ContainerStarted","Data":"6321a216ea221ca591bab933c4080da4dbbc3abbc34ba412c96f71c792e428d0"} Mar 18 09:26:01 crc kubenswrapper[4778]: I0318 09:26:01.126993 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:26:01 crc kubenswrapper[4778]: W0318 09:26:01.132444 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9549b39b_0fc5_4e89_b64a_de83c80735ed.slice/crio-f9b4d34396e0c739602ca88c4a374662e0ebdb4e08eb1919eb24219a92e01561 WatchSource:0}: Error finding container f9b4d34396e0c739602ca88c4a374662e0ebdb4e08eb1919eb24219a92e01561: Status 404 returned error can't find the container with id f9b4d34396e0c739602ca88c4a374662e0ebdb4e08eb1919eb24219a92e01561 Mar 18 09:26:01 crc kubenswrapper[4778]: I0318 09:26:01.931790 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9549b39b-0fc5-4e89-b64a-de83c80735ed","Type":"ContainerStarted","Data":"39b380408f9c28d831352c9df712478d58f8921a5d7729466c4061c2824af3fb"} Mar 18 09:26:01 crc kubenswrapper[4778]: I0318 09:26:01.932341 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"9549b39b-0fc5-4e89-b64a-de83c80735ed","Type":"ContainerStarted","Data":"f9b4d34396e0c739602ca88c4a374662e0ebdb4e08eb1919eb24219a92e01561"} Mar 18 09:26:01 crc kubenswrapper[4778]: I0318 09:26:01.964790 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.964767641 podStartE2EDuration="1.964767641s" podCreationTimestamp="2026-03-18 09:26:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:01.95661504 +0000 UTC m=+1428.531359890" watchObservedRunningTime="2026-03-18 09:26:01.964767641 +0000 UTC m=+1428.539512481" Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.083110 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.085808 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.090951 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.200693 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81cc74f1-64bc-448f-9654-352927efbb4c" path="/var/lib/kubelet/pods/81cc74f1-64bc-448f-9654-352927efbb4c/volumes" Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.943299 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa" containerID="c24773f49ad71f17b93d5ac7609065bb82dac185ae78530f1dcf0ecca87ade20" exitCode=0 Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.943385 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563766-lqhxm" 
event={"ID":"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa","Type":"ContainerDied","Data":"c24773f49ad71f17b93d5ac7609065bb82dac185ae78530f1dcf0ecca87ade20"} Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.947948 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.152441 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-hzxbf"] Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.154313 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.164780 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-hzxbf"] Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.322207 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.322266 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.322318 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-config\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " 
pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.322356 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.322686 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts5rq\" (UniqueName: \"kubernetes.io/projected/31b962b8-c7e1-495c-a52d-f4fb63e884ca-kube-api-access-ts5rq\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.424574 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.424621 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.424649 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-config\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " 
pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.424679 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.424769 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts5rq\" (UniqueName: \"kubernetes.io/projected/31b962b8-c7e1-495c-a52d-f4fb63e884ca-kube-api-access-ts5rq\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.425660 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.425660 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-config\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.425768 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.425877 
4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.450989 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts5rq\" (UniqueName: \"kubernetes.io/projected/31b962b8-c7e1-495c-a52d-f4fb63e884ca-kube-api-access-ts5rq\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.483520 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.942678 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-hzxbf"] Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.339999 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563766-lqhxm" Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.445183 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6djlr\" (UniqueName: \"kubernetes.io/projected/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa-kube-api-access-6djlr\") pod \"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa\" (UID: \"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa\") " Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.450580 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa-kube-api-access-6djlr" (OuterVolumeSpecName: "kube-api-access-6djlr") pod "ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa" (UID: "ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa"). 
InnerVolumeSpecName "kube-api-access-6djlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.547689 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6djlr\" (UniqueName: \"kubernetes.io/projected/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa-kube-api-access-6djlr\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.963128 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563766-lqhxm" event={"ID":"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa","Type":"ContainerDied","Data":"6321a216ea221ca591bab933c4080da4dbbc3abbc34ba412c96f71c792e428d0"} Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.963450 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6321a216ea221ca591bab933c4080da4dbbc3abbc34ba412c96f71c792e428d0" Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.963173 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563766-lqhxm" Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.965041 4778 generic.go:334] "Generic (PLEG): container finished" podID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerID="bda48f5cd3722d61bee57d8aacf12b0b775a391dd2ad3e1c8999cc5bb624e15c" exitCode=0 Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.965259 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" event={"ID":"31b962b8-c7e1-495c-a52d-f4fb63e884ca","Type":"ContainerDied","Data":"bda48f5cd3722d61bee57d8aacf12b0b775a391dd2ad3e1c8999cc5bb624e15c"} Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.965321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" event={"ID":"31b962b8-c7e1-495c-a52d-f4fb63e884ca","Type":"ContainerStarted","Data":"f1b064a501bbcb63fd4dc86edbad3f121ced055deab0d7f76a0a08acbb649b7a"} Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.346858 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.347350 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-central-agent" containerID="cri-o://247ae67d75fa48d0cb6104fd6017725a72fc4e82ff55213dd5e0b293d531d756" gracePeriod=30 Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.347515 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="proxy-httpd" containerID="cri-o://53356b73b893522d6b8ce2a575d851c14fe0eb313a03d20cf218a2982a049325" gracePeriod=30 Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.347587 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="sg-core" containerID="cri-o://a8ed834ba7a0f4ec475460749067a718c91fbf1875be425bac23470f710feed4" gracePeriod=30 Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.347646 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-notification-agent" containerID="cri-o://dcd444b95c81fbfdea64f1fbc2bd6aead66b1e18e8638323c659ba0124ee650c" gracePeriod=30 Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.357816 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.426255 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563760-nvkp2"] Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.433027 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563760-nvkp2"] Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.534334 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.630975 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978574 4778 generic.go:334] "Generic (PLEG): container finished" podID="f997c05f-82b3-4d82-859d-b02f458e355d" containerID="53356b73b893522d6b8ce2a575d851c14fe0eb313a03d20cf218a2982a049325" exitCode=0 Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978627 4778 generic.go:334] "Generic (PLEG): container finished" podID="f997c05f-82b3-4d82-859d-b02f458e355d" containerID="a8ed834ba7a0f4ec475460749067a718c91fbf1875be425bac23470f710feed4" 
exitCode=2 Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978637 4778 generic.go:334] "Generic (PLEG): container finished" podID="f997c05f-82b3-4d82-859d-b02f458e355d" containerID="dcd444b95c81fbfdea64f1fbc2bd6aead66b1e18e8638323c659ba0124ee650c" exitCode=0 Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978647 4778 generic.go:334] "Generic (PLEG): container finished" podID="f997c05f-82b3-4d82-859d-b02f458e355d" containerID="247ae67d75fa48d0cb6104fd6017725a72fc4e82ff55213dd5e0b293d531d756" exitCode=0 Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978680 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerDied","Data":"53356b73b893522d6b8ce2a575d851c14fe0eb313a03d20cf218a2982a049325"} Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978743 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerDied","Data":"a8ed834ba7a0f4ec475460749067a718c91fbf1875be425bac23470f710feed4"} Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerDied","Data":"dcd444b95c81fbfdea64f1fbc2bd6aead66b1e18e8638323c659ba0124ee650c"} Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978774 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerDied","Data":"247ae67d75fa48d0cb6104fd6017725a72fc4e82ff55213dd5e0b293d531d756"} Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.983012 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-log" 
containerID="cri-o://552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38" gracePeriod=30 Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.983632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" event={"ID":"31b962b8-c7e1-495c-a52d-f4fb63e884ca","Type":"ContainerStarted","Data":"76b7c61aa2a7be4fd8a7e4d8e9e84698c3d85d0b2b4f2c826eef7fac6ce91214"} Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.983680 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-api" containerID="cri-o://416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f" gracePeriod=30 Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.983782 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.027998 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" podStartSLOduration=3.027965389 podStartE2EDuration="3.027965389s" podCreationTimestamp="2026-03-18 09:26:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:06.00966012 +0000 UTC m=+1432.584404970" watchObservedRunningTime="2026-03-18 09:26:06.027965389 +0000 UTC m=+1432.602710229" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.165883 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.199706 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc3bf93e-1b00-4852-b69b-0c8d701f56e3" path="/var/lib/kubelet/pods/bc3bf93e-1b00-4852-b69b-0c8d701f56e3/volumes" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.287715 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-combined-ca-bundle\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.287762 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-scripts\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.287873 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml984\" (UniqueName: \"kubernetes.io/projected/f997c05f-82b3-4d82-859d-b02f458e355d-kube-api-access-ml984\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.287897 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-log-httpd\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.287940 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-ceilometer-tls-certs\") pod 
\"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.287997 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-config-data\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.288096 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-sg-core-conf-yaml\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.288124 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-run-httpd\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.288580 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.290306 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.290672 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.290741 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.302486 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-scripts" (OuterVolumeSpecName: "scripts") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.326538 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f997c05f-82b3-4d82-859d-b02f458e355d-kube-api-access-ml984" (OuterVolumeSpecName: "kube-api-access-ml984") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "kube-api-access-ml984". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.372631 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.393212 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.393779 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.393813 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml984\" (UniqueName: \"kubernetes.io/projected/f997c05f-82b3-4d82-859d-b02f458e355d-kube-api-access-ml984\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.421855 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.432787 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.458228 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-config-data" (OuterVolumeSpecName: "config-data") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.532735 4778 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.532784 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.532810 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.000235 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerDied","Data":"24881499c24b431bd6cb35e91dd94903665892ef9cb2374a0f2fab97787e21e9"} Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.000332 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.000340 4778 scope.go:117] "RemoveContainer" containerID="53356b73b893522d6b8ce2a575d851c14fe0eb313a03d20cf218a2982a049325" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.002500 4778 generic.go:334] "Generic (PLEG): container finished" podID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerID="552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38" exitCode=143 Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.003528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68","Type":"ContainerDied","Data":"552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38"} Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.053956 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.069698 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.079843 4778 scope.go:117] "RemoveContainer" containerID="a8ed834ba7a0f4ec475460749067a718c91fbf1875be425bac23470f710feed4" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.086383 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:07 crc kubenswrapper[4778]: E0318 09:26:07.087184 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-central-agent" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087229 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-central-agent" Mar 18 09:26:07 crc kubenswrapper[4778]: E0318 09:26:07.087241 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa" 
containerName="oc" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087307 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa" containerName="oc" Mar 18 09:26:07 crc kubenswrapper[4778]: E0318 09:26:07.087321 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="proxy-httpd" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087331 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="proxy-httpd" Mar 18 09:26:07 crc kubenswrapper[4778]: E0318 09:26:07.087341 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-notification-agent" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087347 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-notification-agent" Mar 18 09:26:07 crc kubenswrapper[4778]: E0318 09:26:07.087426 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="sg-core" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087580 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="sg-core" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087807 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-notification-agent" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087826 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa" containerName="oc" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087836 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="proxy-httpd" 
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087855 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-central-agent" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087861 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="sg-core" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.090782 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.093033 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.093316 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.094160 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.104268 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.134090 4778 scope.go:117] "RemoveContainer" containerID="dcd444b95c81fbfdea64f1fbc2bd6aead66b1e18e8638323c659ba0124ee650c" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.173047 4778 scope.go:117] "RemoveContainer" containerID="247ae67d75fa48d0cb6104fd6017725a72fc4e82ff55213dd5e0b293d531d756" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.245598 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns6zx\" (UniqueName: \"kubernetes.io/projected/52df8ed1-aa17-446a-a3b4-e641f38a409d-kube-api-access-ns6zx\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc 
kubenswrapper[4778]: I0318 09:26:07.246112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246182 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246261 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246283 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-scripts\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246335 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-config-data\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246382 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-log-httpd\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246402 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-run-httpd\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348082 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348163 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348184 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-scripts\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348278 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-config-data\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" 
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348338 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-log-httpd\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348358 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-run-httpd\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348399 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns6zx\" (UniqueName: \"kubernetes.io/projected/52df8ed1-aa17-446a-a3b4-e641f38a409d-kube-api-access-ns6zx\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348416 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.350345 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-run-httpd\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.351649 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-log-httpd\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.356261 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.356836 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-scripts\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.363330 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.364333 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-config-data\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.365095 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.371998 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns6zx\" (UniqueName: \"kubernetes.io/projected/52df8ed1-aa17-446a-a3b4-e641f38a409d-kube-api-access-ns6zx\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.426587 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.734572 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:08 crc kubenswrapper[4778]: I0318 09:26:08.134975 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:08 crc kubenswrapper[4778]: I0318 09:26:08.202120 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" path="/var/lib/kubelet/pods/f997c05f-82b3-4d82-859d-b02f458e355d/volumes" Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.029493 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerStarted","Data":"d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5"} Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.030490 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerStarted","Data":"ba77b580ef06a1e2bd00469a95b5561856fad5eff21c63e51dda25d025a61698"} Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.535319 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.632071 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsn6r\" (UniqueName: \"kubernetes.io/projected/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-kube-api-access-wsn6r\") pod \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.632116 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-config-data\") pod \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.632212 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-logs\") pod \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.632238 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-combined-ca-bundle\") pod \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.632911 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-logs" (OuterVolumeSpecName: "logs") pod "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" (UID: "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.640109 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-kube-api-access-wsn6r" (OuterVolumeSpecName: "kube-api-access-wsn6r") pod "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" (UID: "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68"). InnerVolumeSpecName "kube-api-access-wsn6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.666441 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-config-data" (OuterVolumeSpecName: "config-data") pod "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" (UID: "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.672144 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" (UID: "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.734417 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsn6r\" (UniqueName: \"kubernetes.io/projected/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-kube-api-access-wsn6r\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.734978 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.734996 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.735010 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.040361 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerStarted","Data":"642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19"} Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.042717 4778 generic.go:334] "Generic (PLEG): container finished" podID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerID="416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f" exitCode=0 Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.042783 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68","Type":"ContainerDied","Data":"416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f"} Mar 18 09:26:10 crc 
kubenswrapper[4778]: I0318 09:26:10.042814 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.042833 4778 scope.go:117] "RemoveContainer" containerID="416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.042819 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68","Type":"ContainerDied","Data":"d464bbdf554e5888aa55da417c9822d9fe3e925deb0939352bfe349e89da555d"} Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.074730 4778 scope.go:117] "RemoveContainer" containerID="552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.128910 4778 scope.go:117] "RemoveContainer" containerID="416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f" Mar 18 09:26:10 crc kubenswrapper[4778]: E0318 09:26:10.129456 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f\": container with ID starting with 416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f not found: ID does not exist" containerID="416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.129494 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f"} err="failed to get container status \"416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f\": rpc error: code = NotFound desc = could not find container \"416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f\": container with ID starting with 
416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f not found: ID does not exist" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.129518 4778 scope.go:117] "RemoveContainer" containerID="552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38" Mar 18 09:26:10 crc kubenswrapper[4778]: E0318 09:26:10.130035 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38\": container with ID starting with 552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38 not found: ID does not exist" containerID="552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.130057 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38"} err="failed to get container status \"552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38\": rpc error: code = NotFound desc = could not find container \"552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38\": container with ID starting with 552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38 not found: ID does not exist" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.134301 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.150061 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.158275 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:10 crc kubenswrapper[4778]: E0318 09:26:10.158749 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-log" Mar 18 09:26:10 crc 
kubenswrapper[4778]: I0318 09:26:10.158769 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-log" Mar 18 09:26:10 crc kubenswrapper[4778]: E0318 09:26:10.158787 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-api" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.158794 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-api" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.158962 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-api" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.158984 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-log" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.159922 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.163621 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.163862 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.164890 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.165931 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.204674 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" path="/var/lib/kubelet/pods/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68/volumes" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.348650 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-public-tls-certs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.348730 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7208979b-1773-4741-8dab-00c621897016-logs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.348768 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-config-data\") pod \"nova-api-0\" (UID: 
\"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.349441 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.349600 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.349640 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wv5\" (UniqueName: \"kubernetes.io/projected/7208979b-1773-4741-8dab-00c621897016-kube-api-access-72wv5\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.452455 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.453028 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.453054 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wv5\" (UniqueName: \"kubernetes.io/projected/7208979b-1773-4741-8dab-00c621897016-kube-api-access-72wv5\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.453220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-public-tls-certs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.453255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7208979b-1773-4741-8dab-00c621897016-logs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.453272 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-config-data\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.455123 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7208979b-1773-4741-8dab-00c621897016-logs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.460566 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " 
pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.461187 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.465712 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-public-tls-certs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.468085 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-config-data\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.475087 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wv5\" (UniqueName: \"kubernetes.io/projected/7208979b-1773-4741-8dab-00c621897016-kube-api-access-72wv5\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.476989 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.631014 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.661093 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.938927 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:10 crc kubenswrapper[4778]: W0318 09:26:10.948665 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7208979b_1773_4741_8dab_00c621897016.slice/crio-5094a9653f4ef25b7c722e95f4034590d8e493b9399c47f1c2173d365ce1a395 WatchSource:0}: Error finding container 5094a9653f4ef25b7c722e95f4034590d8e493b9399c47f1c2173d365ce1a395: Status 404 returned error can't find the container with id 5094a9653f4ef25b7c722e95f4034590d8e493b9399c47f1c2173d365ce1a395 Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.066421 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerStarted","Data":"778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a"} Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.069605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7208979b-1773-4741-8dab-00c621897016","Type":"ContainerStarted","Data":"5094a9653f4ef25b7c722e95f4034590d8e493b9399c47f1c2173d365ce1a395"} Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.093134 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.414157 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-2nml6"] Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.416674 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.420033 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.420385 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.427904 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2nml6"] Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.486300 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-scripts\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.486504 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.486608 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kf8g\" (UniqueName: \"kubernetes.io/projected/32eb800e-69e8-4e39-ae5b-74a5eec87b00-kube-api-access-7kf8g\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc 
kubenswrapper[4778]: I0318 09:26:11.486785 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-config-data\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.589240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.589335 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kf8g\" (UniqueName: \"kubernetes.io/projected/32eb800e-69e8-4e39-ae5b-74a5eec87b00-kube-api-access-7kf8g\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.589413 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-config-data\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.589470 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-scripts\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 
09:26:11.595168 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.596016 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-config-data\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.597521 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-scripts\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.608553 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kf8g\" (UniqueName: \"kubernetes.io/projected/32eb800e-69e8-4e39-ae5b-74a5eec87b00-kube-api-access-7kf8g\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.736495 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:12 crc kubenswrapper[4778]: I0318 09:26:12.100337 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7208979b-1773-4741-8dab-00c621897016","Type":"ContainerStarted","Data":"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f"} Mar 18 09:26:12 crc kubenswrapper[4778]: I0318 09:26:12.102816 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7208979b-1773-4741-8dab-00c621897016","Type":"ContainerStarted","Data":"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465"} Mar 18 09:26:12 crc kubenswrapper[4778]: I0318 09:26:12.135795 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.135772091 podStartE2EDuration="2.135772091s" podCreationTimestamp="2026-03-18 09:26:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:12.119913879 +0000 UTC m=+1438.694658719" watchObservedRunningTime="2026-03-18 09:26:12.135772091 +0000 UTC m=+1438.710516931" Mar 18 09:26:12 crc kubenswrapper[4778]: I0318 09:26:12.251932 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2nml6"] Mar 18 09:26:12 crc kubenswrapper[4778]: W0318 09:26:12.253436 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32eb800e_69e8_4e39_ae5b_74a5eec87b00.slice/crio-2ebc375f292c2ec65bb43218ef25687c949a64e0554485c66d56f633736ab6dd WatchSource:0}: Error finding container 2ebc375f292c2ec65bb43218ef25687c949a64e0554485c66d56f633736ab6dd: Status 404 returned error can't find the container with id 2ebc375f292c2ec65bb43218ef25687c949a64e0554485c66d56f633736ab6dd Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.128885 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerStarted","Data":"2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323"} Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.131826 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.129591 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="proxy-httpd" containerID="cri-o://2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323" gracePeriod=30 Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.129687 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-notification-agent" containerID="cri-o://642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19" gracePeriod=30 Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.129707 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="sg-core" containerID="cri-o://778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a" gracePeriod=30 Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.129004 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-central-agent" containerID="cri-o://d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5" gracePeriod=30 Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.133983 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2nml6" 
event={"ID":"32eb800e-69e8-4e39-ae5b-74a5eec87b00","Type":"ContainerStarted","Data":"3858148ddf213daa44ce8f206664d3360023f6d6f91e24bcfce11a24c0f0213c"} Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.134053 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2nml6" event={"ID":"32eb800e-69e8-4e39-ae5b-74a5eec87b00","Type":"ContainerStarted","Data":"2ebc375f292c2ec65bb43218ef25687c949a64e0554485c66d56f633736ab6dd"} Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.173733 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.456762556 podStartE2EDuration="6.173715654s" podCreationTimestamp="2026-03-18 09:26:07 +0000 UTC" firstStartedPulling="2026-03-18 09:26:08.131997361 +0000 UTC m=+1434.706742221" lastFinishedPulling="2026-03-18 09:26:11.848950469 +0000 UTC m=+1438.423695319" observedRunningTime="2026-03-18 09:26:13.173463578 +0000 UTC m=+1439.748208448" watchObservedRunningTime="2026-03-18 09:26:13.173715654 +0000 UTC m=+1439.748460504" Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.201746 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-2nml6" podStartSLOduration=2.201720697 podStartE2EDuration="2.201720697s" podCreationTimestamp="2026-03-18 09:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:13.198708795 +0000 UTC m=+1439.773453635" watchObservedRunningTime="2026-03-18 09:26:13.201720697 +0000 UTC m=+1439.776465547" Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.486471 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.557656 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-f44bz"] Mar 
18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.557955 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerName="dnsmasq-dns" containerID="cri-o://e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e" gracePeriod=10 Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.078317 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.160763 4778 generic.go:334] "Generic (PLEG): container finished" podID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerID="2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323" exitCode=0 Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.160809 4778 generic.go:334] "Generic (PLEG): container finished" podID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerID="778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a" exitCode=2 Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.160820 4778 generic.go:334] "Generic (PLEG): container finished" podID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerID="642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19" exitCode=0 Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.160874 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerDied","Data":"2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323"} Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.160915 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerDied","Data":"778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a"} Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.160927 4778 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerDied","Data":"642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19"} Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.163650 4778 generic.go:334] "Generic (PLEG): container finished" podID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerID="e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e" exitCode=0 Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.165046 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.168077 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" event={"ID":"ecb86d82-de0e-474c-9942-a8dff1f8739b","Type":"ContainerDied","Data":"e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e"} Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.168139 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" event={"ID":"ecb86d82-de0e-474c-9942-a8dff1f8739b","Type":"ContainerDied","Data":"a9b184c2a1d038cecffbf8235c70616929fdbf84785e09acda62d2b9837c969e"} Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.168168 4778 scope.go:117] "RemoveContainer" containerID="e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.196340 4778 scope.go:117] "RemoveContainer" containerID="fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.226678 4778 scope.go:117] "RemoveContainer" containerID="e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e" Mar 18 09:26:14 crc kubenswrapper[4778]: E0318 09:26:14.227183 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e\": container with ID starting with e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e not found: ID does not exist" containerID="e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.227285 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e"} err="failed to get container status \"e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e\": rpc error: code = NotFound desc = could not find container \"e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e\": container with ID starting with e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e not found: ID does not exist" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.227330 4778 scope.go:117] "RemoveContainer" containerID="fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6" Mar 18 09:26:14 crc kubenswrapper[4778]: E0318 09:26:14.228035 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6\": container with ID starting with fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6 not found: ID does not exist" containerID="fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.228109 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6"} err="failed to get container status \"fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6\": rpc error: code = NotFound desc = could not find container \"fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6\": container with ID 
starting with fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6 not found: ID does not exist" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.248176 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-dns-svc\") pod \"ecb86d82-de0e-474c-9942-a8dff1f8739b\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.248238 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9vgw\" (UniqueName: \"kubernetes.io/projected/ecb86d82-de0e-474c-9942-a8dff1f8739b-kube-api-access-s9vgw\") pod \"ecb86d82-de0e-474c-9942-a8dff1f8739b\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.248330 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-nb\") pod \"ecb86d82-de0e-474c-9942-a8dff1f8739b\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.248390 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-sb\") pod \"ecb86d82-de0e-474c-9942-a8dff1f8739b\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.248431 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-config\") pod \"ecb86d82-de0e-474c-9942-a8dff1f8739b\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.256729 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/ecb86d82-de0e-474c-9942-a8dff1f8739b-kube-api-access-s9vgw" (OuterVolumeSpecName: "kube-api-access-s9vgw") pod "ecb86d82-de0e-474c-9942-a8dff1f8739b" (UID: "ecb86d82-de0e-474c-9942-a8dff1f8739b"). InnerVolumeSpecName "kube-api-access-s9vgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.301118 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ecb86d82-de0e-474c-9942-a8dff1f8739b" (UID: "ecb86d82-de0e-474c-9942-a8dff1f8739b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.301858 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ecb86d82-de0e-474c-9942-a8dff1f8739b" (UID: "ecb86d82-de0e-474c-9942-a8dff1f8739b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.313925 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ecb86d82-de0e-474c-9942-a8dff1f8739b" (UID: "ecb86d82-de0e-474c-9942-a8dff1f8739b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.316084 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-config" (OuterVolumeSpecName: "config") pod "ecb86d82-de0e-474c-9942-a8dff1f8739b" (UID: "ecb86d82-de0e-474c-9942-a8dff1f8739b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.350811 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.350845 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.350854 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.350864 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.350877 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9vgw\" (UniqueName: \"kubernetes.io/projected/ecb86d82-de0e-474c-9942-a8dff1f8739b-kube-api-access-s9vgw\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.592916 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-f44bz"] Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.604170 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-f44bz"] Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.875850 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062687 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-config-data\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062764 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-run-httpd\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062860 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns6zx\" (UniqueName: \"kubernetes.io/projected/52df8ed1-aa17-446a-a3b4-e641f38a409d-kube-api-access-ns6zx\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062883 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-combined-ca-bundle\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062927 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-log-httpd\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062963 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-scripts\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062993 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-sg-core-conf-yaml\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.063017 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-ceilometer-tls-certs\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.063452 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.064082 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.064242 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.072561 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-scripts" (OuterVolumeSpecName: "scripts") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.073382 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52df8ed1-aa17-446a-a3b4-e641f38a409d-kube-api-access-ns6zx" (OuterVolumeSpecName: "kube-api-access-ns6zx") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "kube-api-access-ns6zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.105436 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.140087 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). 
InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.165734 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns6zx\" (UniqueName: \"kubernetes.io/projected/52df8ed1-aa17-446a-a3b4-e641f38a409d-kube-api-access-ns6zx\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.165768 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.165781 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.165793 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.165802 4778 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.183534 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.183892 4778 generic.go:334] "Generic (PLEG): container finished" podID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerID="d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5" exitCode=0 Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.183973 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.183971 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerDied","Data":"d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5"} Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.184080 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerDied","Data":"ba77b580ef06a1e2bd00469a95b5561856fad5eff21c63e51dda25d025a61698"} Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.184103 4778 scope.go:117] "RemoveContainer" containerID="2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.203442 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-config-data" (OuterVolumeSpecName: "config-data") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.261873 4778 scope.go:117] "RemoveContainer" containerID="778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.266881 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.266907 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.300130 4778 scope.go:117] "RemoveContainer" containerID="642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.321877 4778 scope.go:117] "RemoveContainer" containerID="d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.357119 4778 scope.go:117] "RemoveContainer" containerID="2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.357650 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323\": container with ID starting with 2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323 not found: ID does not exist" containerID="2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.357695 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323"} 
err="failed to get container status \"2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323\": rpc error: code = NotFound desc = could not find container \"2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323\": container with ID starting with 2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323 not found: ID does not exist" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.357721 4778 scope.go:117] "RemoveContainer" containerID="778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.358759 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a\": container with ID starting with 778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a not found: ID does not exist" containerID="778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.358799 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a"} err="failed to get container status \"778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a\": rpc error: code = NotFound desc = could not find container \"778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a\": container with ID starting with 778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a not found: ID does not exist" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.358857 4778 scope.go:117] "RemoveContainer" containerID="642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.359494 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19\": container with ID starting with 642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19 not found: ID does not exist" containerID="642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.359541 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19"} err="failed to get container status \"642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19\": rpc error: code = NotFound desc = could not find container \"642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19\": container with ID starting with 642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19 not found: ID does not exist" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.359591 4778 scope.go:117] "RemoveContainer" containerID="d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.361074 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5\": container with ID starting with d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5 not found: ID does not exist" containerID="d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.361103 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5"} err="failed to get container status \"d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5\": rpc error: code = NotFound desc = could not find container \"d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5\": container with ID 
starting with d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5 not found: ID does not exist" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.534789 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.546306 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.564491 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.564994 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerName="init" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565014 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerName="init" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.565028 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerName="dnsmasq-dns" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565036 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerName="dnsmasq-dns" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.565049 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="proxy-httpd" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565054 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="proxy-httpd" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.565072 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="sg-core" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565078 4778 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="sg-core" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.565090 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-central-agent" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565096 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-central-agent" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.565110 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-notification-agent" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565118 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-notification-agent" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565361 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-central-agent" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565376 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="sg-core" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565388 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-notification-agent" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565396 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="proxy-httpd" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565410 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerName="dnsmasq-dns" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 
09:26:15.569779 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.574912 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.575068 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.575891 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.622329 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.673707 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-config-data\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.673822 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.673870 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7799h\" (UniqueName: \"kubernetes.io/projected/d6f19125-b59e-49f9-8819-0cad52114840-kube-api-access-7799h\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.673944 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-run-httpd\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.674001 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.674301 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-log-httpd\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.674376 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-scripts\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.674535 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776094 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-scripts\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776171 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776218 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-config-data\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776247 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776269 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7799h\" (UniqueName: \"kubernetes.io/projected/d6f19125-b59e-49f9-8819-0cad52114840-kube-api-access-7799h\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776295 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-run-httpd\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: 
I0318 09:26:15.776318 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-log-httpd\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776790 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-log-httpd\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.780370 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-run-httpd\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.781401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.782542 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-config-data\") pod \"ceilometer-0\" (UID: 
\"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.784283 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.784625 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.789473 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-scripts\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.800608 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7799h\" (UniqueName: \"kubernetes.io/projected/d6f19125-b59e-49f9-8819-0cad52114840-kube-api-access-7799h\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.909837 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:16 crc kubenswrapper[4778]: I0318 09:26:16.199604 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" path="/var/lib/kubelet/pods/52df8ed1-aa17-446a-a3b4-e641f38a409d/volumes" Mar 18 09:26:16 crc kubenswrapper[4778]: I0318 09:26:16.201279 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" path="/var/lib/kubelet/pods/ecb86d82-de0e-474c-9942-a8dff1f8739b/volumes" Mar 18 09:26:16 crc kubenswrapper[4778]: I0318 09:26:16.397442 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:17 crc kubenswrapper[4778]: I0318 09:26:17.212299 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerStarted","Data":"72617aca4aa8400e7af4a60ff63ae0a42982c759ce5821a76119f425b2752b6a"} Mar 18 09:26:17 crc kubenswrapper[4778]: I0318 09:26:17.212382 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerStarted","Data":"3887f968114761a1654028b0a71448b233a4c1f413820ca59edf51160a07bebd"} Mar 18 09:26:18 crc kubenswrapper[4778]: I0318 09:26:18.223962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerStarted","Data":"d0ab691768395396ab9cd2b17a7818f8101a3f17ea12d03f3697e6b6b5c5a0b8"} Mar 18 09:26:18 crc kubenswrapper[4778]: I0318 09:26:18.226069 4778 generic.go:334] "Generic (PLEG): container finished" podID="32eb800e-69e8-4e39-ae5b-74a5eec87b00" containerID="3858148ddf213daa44ce8f206664d3360023f6d6f91e24bcfce11a24c0f0213c" exitCode=0 Mar 18 09:26:18 crc kubenswrapper[4778]: I0318 09:26:18.226110 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-2nml6" event={"ID":"32eb800e-69e8-4e39-ae5b-74a5eec87b00","Type":"ContainerDied","Data":"3858148ddf213daa44ce8f206664d3360023f6d6f91e24bcfce11a24c0f0213c"} Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.308317 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerStarted","Data":"28ea8f018593b4e5ea693b13884510308b8fe6c9f56dc77f633042aa7b3c2142"} Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.662431 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.810016 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-scripts\") pod \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.810442 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kf8g\" (UniqueName: \"kubernetes.io/projected/32eb800e-69e8-4e39-ae5b-74a5eec87b00-kube-api-access-7kf8g\") pod \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.810989 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-combined-ca-bundle\") pod \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.811114 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-config-data\") 
pod \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.817325 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-scripts" (OuterVolumeSpecName: "scripts") pod "32eb800e-69e8-4e39-ae5b-74a5eec87b00" (UID: "32eb800e-69e8-4e39-ae5b-74a5eec87b00"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.820424 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32eb800e-69e8-4e39-ae5b-74a5eec87b00-kube-api-access-7kf8g" (OuterVolumeSpecName: "kube-api-access-7kf8g") pod "32eb800e-69e8-4e39-ae5b-74a5eec87b00" (UID: "32eb800e-69e8-4e39-ae5b-74a5eec87b00"). InnerVolumeSpecName "kube-api-access-7kf8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.844345 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32eb800e-69e8-4e39-ae5b-74a5eec87b00" (UID: "32eb800e-69e8-4e39-ae5b-74a5eec87b00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.867408 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-config-data" (OuterVolumeSpecName: "config-data") pod "32eb800e-69e8-4e39-ae5b-74a5eec87b00" (UID: "32eb800e-69e8-4e39-ae5b-74a5eec87b00"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.916065 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kf8g\" (UniqueName: \"kubernetes.io/projected/32eb800e-69e8-4e39-ae5b-74a5eec87b00-kube-api-access-7kf8g\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.916122 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.916133 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.916144 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.326063 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2nml6" event={"ID":"32eb800e-69e8-4e39-ae5b-74a5eec87b00","Type":"ContainerDied","Data":"2ebc375f292c2ec65bb43218ef25687c949a64e0554485c66d56f633736ab6dd"} Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.326502 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ebc375f292c2ec65bb43218ef25687c949a64e0554485c66d56f633736ab6dd" Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.326128 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.329407 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerStarted","Data":"345ccf529d04d20baa9e9e06ea792464e3dc120ee1042052fe45190d2b838b33"} Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.329847 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.471028 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7986296149999998 podStartE2EDuration="5.470998417s" podCreationTimestamp="2026-03-18 09:26:15 +0000 UTC" firstStartedPulling="2026-03-18 09:26:16.402448062 +0000 UTC m=+1442.977192902" lastFinishedPulling="2026-03-18 09:26:20.074816864 +0000 UTC m=+1446.649561704" observedRunningTime="2026-03-18 09:26:20.35403479 +0000 UTC m=+1446.928779650" watchObservedRunningTime="2026-03-18 09:26:20.470998417 +0000 UTC m=+1447.045743257" Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.472150 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.472670 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-log" containerID="cri-o://91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465" gracePeriod=30 Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.472777 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-api" containerID="cri-o://6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f" gracePeriod=30 Mar 18 09:26:20 crc 
kubenswrapper[4778]: I0318 09:26:20.522157 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.522566 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-log" containerID="cri-o://1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2" gracePeriod=30 Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.522716 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-metadata" containerID="cri-o://b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1" gracePeriod=30 Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.540272 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.540630 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" containerName="nova-scheduler-scheduler" containerID="cri-o://cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" gracePeriod=30 Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.047883 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.072750 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.077680 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.077752 4778 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" containerName="nova-scheduler-scheduler" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.278578 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.349848 4778 generic.go:334] "Generic (PLEG): container finished" podID="7208979b-1773-4741-8dab-00c621897016" containerID="6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f" exitCode=0 Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.349879 4778 generic.go:334] "Generic (PLEG): container finished" podID="7208979b-1773-4741-8dab-00c621897016" containerID="91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465" exitCode=143 Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.349909 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7208979b-1773-4741-8dab-00c621897016","Type":"ContainerDied","Data":"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f"} Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.349950 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.349968 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7208979b-1773-4741-8dab-00c621897016","Type":"ContainerDied","Data":"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465"} Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.349983 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7208979b-1773-4741-8dab-00c621897016","Type":"ContainerDied","Data":"5094a9653f4ef25b7c722e95f4034590d8e493b9399c47f1c2173d365ce1a395"} Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.350005 4778 scope.go:117] "RemoveContainer" containerID="6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.351800 4778 generic.go:334] "Generic (PLEG): container finished" podID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" 
containerID="1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2" exitCode=143 Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.351946 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e","Type":"ContainerDied","Data":"1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2"} Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.372132 4778 scope.go:117] "RemoveContainer" containerID="91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.391379 4778 scope.go:117] "RemoveContainer" containerID="6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f" Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.392059 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f\": container with ID starting with 6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f not found: ID does not exist" containerID="6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.392092 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f"} err="failed to get container status \"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f\": rpc error: code = NotFound desc = could not find container \"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f\": container with ID starting with 6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f not found: ID does not exist" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.392115 4778 scope.go:117] "RemoveContainer" containerID="91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465" Mar 18 
09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.392806 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465\": container with ID starting with 91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465 not found: ID does not exist" containerID="91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.392868 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465"} err="failed to get container status \"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465\": rpc error: code = NotFound desc = could not find container \"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465\": container with ID starting with 91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465 not found: ID does not exist" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.392910 4778 scope.go:117] "RemoveContainer" containerID="6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.393508 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f"} err="failed to get container status \"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f\": rpc error: code = NotFound desc = could not find container \"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f\": container with ID starting with 6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f not found: ID does not exist" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.393555 4778 scope.go:117] "RemoveContainer" 
containerID="91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.394005 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465"} err="failed to get container status \"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465\": rpc error: code = NotFound desc = could not find container \"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465\": container with ID starting with 91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465 not found: ID does not exist" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.450881 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-internal-tls-certs\") pod \"7208979b-1773-4741-8dab-00c621897016\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.451003 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-public-tls-certs\") pod \"7208979b-1773-4741-8dab-00c621897016\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.451175 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-combined-ca-bundle\") pod \"7208979b-1773-4741-8dab-00c621897016\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.451256 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7208979b-1773-4741-8dab-00c621897016-logs\") pod 
\"7208979b-1773-4741-8dab-00c621897016\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.451342 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-config-data\") pod \"7208979b-1773-4741-8dab-00c621897016\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.451429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72wv5\" (UniqueName: \"kubernetes.io/projected/7208979b-1773-4741-8dab-00c621897016-kube-api-access-72wv5\") pod \"7208979b-1773-4741-8dab-00c621897016\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.452107 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7208979b-1773-4741-8dab-00c621897016-logs" (OuterVolumeSpecName: "logs") pod "7208979b-1773-4741-8dab-00c621897016" (UID: "7208979b-1773-4741-8dab-00c621897016"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.453038 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7208979b-1773-4741-8dab-00c621897016-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.457563 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7208979b-1773-4741-8dab-00c621897016-kube-api-access-72wv5" (OuterVolumeSpecName: "kube-api-access-72wv5") pod "7208979b-1773-4741-8dab-00c621897016" (UID: "7208979b-1773-4741-8dab-00c621897016"). InnerVolumeSpecName "kube-api-access-72wv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.480237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7208979b-1773-4741-8dab-00c621897016" (UID: "7208979b-1773-4741-8dab-00c621897016"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.486190 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-config-data" (OuterVolumeSpecName: "config-data") pod "7208979b-1773-4741-8dab-00c621897016" (UID: "7208979b-1773-4741-8dab-00c621897016"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.516417 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7208979b-1773-4741-8dab-00c621897016" (UID: "7208979b-1773-4741-8dab-00c621897016"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.529799 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7208979b-1773-4741-8dab-00c621897016" (UID: "7208979b-1773-4741-8dab-00c621897016"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.554912 4778 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.554946 4778 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.554954 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.554964 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.554973 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72wv5\" (UniqueName: \"kubernetes.io/projected/7208979b-1773-4741-8dab-00c621897016-kube-api-access-72wv5\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.680552 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.695556 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.706984 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.707436 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-log" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.707454 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-log" Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.707479 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32eb800e-69e8-4e39-ae5b-74a5eec87b00" containerName="nova-manage" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.707487 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="32eb800e-69e8-4e39-ae5b-74a5eec87b00" containerName="nova-manage" Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.707496 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-api" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.707501 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-api" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.707711 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-log" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.707737 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="32eb800e-69e8-4e39-ae5b-74a5eec87b00" containerName="nova-manage" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.707752 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-api" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.708846 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.711392 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.711493 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.711675 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.716035 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.871367 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.872346 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.872499 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a702c51-b7a6-4094-9d34-519102e1cf91-logs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.872600 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxhbn\" 
(UniqueName: \"kubernetes.io/projected/8a702c51-b7a6-4094-9d34-519102e1cf91-kube-api-access-lxhbn\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.872797 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-config-data\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.872926 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-public-tls-certs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.974904 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-public-tls-certs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.975455 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.975613 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " 
pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.975753 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a702c51-b7a6-4094-9d34-519102e1cf91-logs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.975875 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxhbn\" (UniqueName: \"kubernetes.io/projected/8a702c51-b7a6-4094-9d34-519102e1cf91-kube-api-access-lxhbn\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.976024 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-config-data\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.976286 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a702c51-b7a6-4094-9d34-519102e1cf91-logs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.980126 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.981222 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-config-data\") pod 
\"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.982018 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.989733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-public-tls-certs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.997700 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxhbn\" (UniqueName: \"kubernetes.io/projected/8a702c51-b7a6-4094-9d34-519102e1cf91-kube-api-access-lxhbn\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0" Mar 18 09:26:22 crc kubenswrapper[4778]: I0318 09:26:22.077431 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:26:22 crc kubenswrapper[4778]: I0318 09:26:22.212116 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7208979b-1773-4741-8dab-00c621897016" path="/var/lib/kubelet/pods/7208979b-1773-4741-8dab-00c621897016/volumes" Mar 18 09:26:22 crc kubenswrapper[4778]: I0318 09:26:22.544760 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:22 crc kubenswrapper[4778]: W0318 09:26:22.553091 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a702c51_b7a6_4094_9d34_519102e1cf91.slice/crio-7715785cb5d43f547a06695d17c8835cd5212c03e6fa259322ada2a1dd76c24b WatchSource:0}: Error finding container 7715785cb5d43f547a06695d17c8835cd5212c03e6fa259322ada2a1dd76c24b: Status 404 returned error can't find the container with id 7715785cb5d43f547a06695d17c8835cd5212c03e6fa259322ada2a1dd76c24b Mar 18 09:26:23 crc kubenswrapper[4778]: I0318 09:26:23.382789 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a702c51-b7a6-4094-9d34-519102e1cf91","Type":"ContainerStarted","Data":"4cf481707b208cd4f8da4862b8ab7b0cc662181c8e30447acc8ec31cd4d1bfa4"} Mar 18 09:26:23 crc kubenswrapper[4778]: I0318 09:26:23.383181 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a702c51-b7a6-4094-9d34-519102e1cf91","Type":"ContainerStarted","Data":"5b977842c7b022e09540c13c0c1623d4faf93b8ba5a2f00023b87208c2df8aa4"} Mar 18 09:26:23 crc kubenswrapper[4778]: I0318 09:26:23.383220 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a702c51-b7a6-4094-9d34-519102e1cf91","Type":"ContainerStarted","Data":"7715785cb5d43f547a06695d17c8835cd5212c03e6fa259322ada2a1dd76c24b"} Mar 18 09:26:23 crc kubenswrapper[4778]: I0318 09:26:23.415652 4778 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.415628426 podStartE2EDuration="2.415628426s" podCreationTimestamp="2026-03-18 09:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:23.403748841 +0000 UTC m=+1449.978493711" watchObservedRunningTime="2026-03-18 09:26:23.415628426 +0000 UTC m=+1449.990373276" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.152827 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.224803 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-logs\") pod \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.225311 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-logs" (OuterVolumeSpecName: "logs") pod "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" (UID: "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.225770 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ljtb\" (UniqueName: \"kubernetes.io/projected/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-kube-api-access-7ljtb\") pod \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.225851 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-config-data\") pod \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.226354 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-combined-ca-bundle\") pod \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.226410 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-nova-metadata-tls-certs\") pod \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.227419 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.238036 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-kube-api-access-7ljtb" (OuterVolumeSpecName: 
"kube-api-access-7ljtb") pod "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" (UID: "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e"). InnerVolumeSpecName "kube-api-access-7ljtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.267760 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" (UID: "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.294881 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-config-data" (OuterVolumeSpecName: "config-data") pod "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" (UID: "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.337133 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ljtb\" (UniqueName: \"kubernetes.io/projected/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-kube-api-access-7ljtb\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.337166 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.337176 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.373645 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" (UID: "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.394428 4778 generic.go:334] "Generic (PLEG): container finished" podID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerID="b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1" exitCode=0 Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.395403 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.397379 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e","Type":"ContainerDied","Data":"b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1"} Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.397426 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e","Type":"ContainerDied","Data":"290b792dbc94b49540da6dec52821c0018cd2340a491c925285465d74334b24e"} Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.397448 4778 scope.go:117] "RemoveContainer" containerID="b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.432952 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.439035 4778 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.441377 4778 scope.go:117] "RemoveContainer" containerID="1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.455056 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.465677 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:26:24 crc kubenswrapper[4778]: E0318 09:26:24.466081 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-metadata" Mar 18 09:26:24 crc kubenswrapper[4778]: 
I0318 09:26:24.466106 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-metadata" Mar 18 09:26:24 crc kubenswrapper[4778]: E0318 09:26:24.466115 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-log" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.466121 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-log" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.466306 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-log" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.466321 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-metadata" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.467251 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.473317 4778 scope.go:117] "RemoveContainer" containerID="b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.473734 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.473791 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 09:26:24 crc kubenswrapper[4778]: E0318 09:26:24.474207 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1\": container with ID starting with b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1 not found: ID does not exist" containerID="b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.474241 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1"} err="failed to get container status \"b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1\": rpc error: code = NotFound desc = could not find container \"b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1\": container with ID starting with b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1 not found: ID does not exist" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.474260 4778 scope.go:117] "RemoveContainer" containerID="1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2" Mar 18 09:26:24 crc kubenswrapper[4778]: E0318 09:26:24.474918 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2\": container with ID starting with 1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2 not found: ID does not exist" containerID="1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.474941 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2"} err="failed to get container status \"1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2\": rpc error: code = NotFound desc = could not find container \"1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2\": container with ID starting with 1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2 not found: ID does not exist" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.482320 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.642777 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-config-data\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.642862 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.642932 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx5lz\" 
(UniqueName: \"kubernetes.io/projected/28f01ca6-f7d2-4de3-9aa9-256803533b80-kube-api-access-xx5lz\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.642986 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.643049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f01ca6-f7d2-4de3-9aa9-256803533b80-logs\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.744382 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-config-data\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.744439 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.744478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx5lz\" (UniqueName: \"kubernetes.io/projected/28f01ca6-f7d2-4de3-9aa9-256803533b80-kube-api-access-xx5lz\") pod \"nova-metadata-0\" (UID: 
\"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.744507 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.744540 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f01ca6-f7d2-4de3-9aa9-256803533b80-logs\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.745002 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f01ca6-f7d2-4de3-9aa9-256803533b80-logs\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.749956 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-config-data\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.750471 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.751615 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.764891 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx5lz\" (UniqueName: \"kubernetes.io/projected/28f01ca6-f7d2-4de3-9aa9-256803533b80-kube-api-access-xx5lz\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.796487 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.277283 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.356983 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl9dz\" (UniqueName: \"kubernetes.io/projected/dd2218f5-0310-4e4c-8edc-d13c25707ea5-kube-api-access-vl9dz\") pod \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.357071 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-config-data\") pod \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.357135 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-combined-ca-bundle\") pod \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " Mar 18 
09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.404826 4778 generic.go:334] "Generic (PLEG): container finished" podID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" exitCode=0 Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.404887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd2218f5-0310-4e4c-8edc-d13c25707ea5","Type":"ContainerDied","Data":"cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413"} Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.404917 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd2218f5-0310-4e4c-8edc-d13c25707ea5","Type":"ContainerDied","Data":"b346fc3122696affb646e267887f25b4c0332b9b1cc46fa78a99f5366632ff21"} Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.404933 4778 scope.go:117] "RemoveContainer" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.404961 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.411642 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2218f5-0310-4e4c-8edc-d13c25707ea5-kube-api-access-vl9dz" (OuterVolumeSpecName: "kube-api-access-vl9dz") pod "dd2218f5-0310-4e4c-8edc-d13c25707ea5" (UID: "dd2218f5-0310-4e4c-8edc-d13c25707ea5"). InnerVolumeSpecName "kube-api-access-vl9dz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.416660 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-config-data" (OuterVolumeSpecName: "config-data") pod "dd2218f5-0310-4e4c-8edc-d13c25707ea5" (UID: "dd2218f5-0310-4e4c-8edc-d13c25707ea5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.417005 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd2218f5-0310-4e4c-8edc-d13c25707ea5" (UID: "dd2218f5-0310-4e4c-8edc-d13c25707ea5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.435841 4778 scope.go:117] "RemoveContainer" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" Mar 18 09:26:25 crc kubenswrapper[4778]: E0318 09:26:25.442772 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413\": container with ID starting with cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413 not found: ID does not exist" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.442877 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413"} err="failed to get container status \"cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413\": rpc error: code = NotFound desc = could not find container 
\"cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413\": container with ID starting with cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413 not found: ID does not exist" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.470036 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.470092 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl9dz\" (UniqueName: \"kubernetes.io/projected/dd2218f5-0310-4e4c-8edc-d13c25707ea5-kube-api-access-vl9dz\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.470112 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.472639 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.767233 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.781792 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.791831 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:26:25 crc kubenswrapper[4778]: E0318 09:26:25.792404 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" containerName="nova-scheduler-scheduler" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.792432 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" 
containerName="nova-scheduler-scheduler" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.792624 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" containerName="nova-scheduler-scheduler" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.793563 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.796801 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.807047 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.878130 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1623d1-2084-419e-b36a-80930113a280-config-data\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.878486 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1623d1-2084-419e-b36a-80930113a280-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.878713 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpq4z\" (UniqueName: \"kubernetes.io/projected/9b1623d1-2084-419e-b36a-80930113a280-kube-api-access-mpq4z\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.981342 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1623d1-2084-419e-b36a-80930113a280-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.982075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpq4z\" (UniqueName: \"kubernetes.io/projected/9b1623d1-2084-419e-b36a-80930113a280-kube-api-access-mpq4z\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.982289 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1623d1-2084-419e-b36a-80930113a280-config-data\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.986516 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1623d1-2084-419e-b36a-80930113a280-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.987537 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1623d1-2084-419e-b36a-80930113a280-config-data\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.008914 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpq4z\" (UniqueName: 
\"kubernetes.io/projected/9b1623d1-2084-419e-b36a-80930113a280-kube-api-access-mpq4z\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.110371 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.209171 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" path="/var/lib/kubelet/pods/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e/volumes" Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.210353 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" path="/var/lib/kubelet/pods/dd2218f5-0310-4e4c-8edc-d13c25707ea5/volumes" Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.423221 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28f01ca6-f7d2-4de3-9aa9-256803533b80","Type":"ContainerStarted","Data":"cb52cc57268289cd695bd462776bcac79b728c35c7228ef7b1b59ad2d9aa1d33"} Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.423610 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28f01ca6-f7d2-4de3-9aa9-256803533b80","Type":"ContainerStarted","Data":"45f9eb07f3611369455207e143d7f684b92e8fb675c65c647a3e90225a551a0f"} Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.423623 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28f01ca6-f7d2-4de3-9aa9-256803533b80","Type":"ContainerStarted","Data":"2c451f78461244dc8809e7e9b61abbdcb51053b49e02275879d48a3bf93a2146"} Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.443752 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.443732519 
podStartE2EDuration="2.443732519s" podCreationTimestamp="2026-03-18 09:26:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:26.441629371 +0000 UTC m=+1453.016374211" watchObservedRunningTime="2026-03-18 09:26:26.443732519 +0000 UTC m=+1453.018477359" Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.629563 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:26:27 crc kubenswrapper[4778]: I0318 09:26:27.436161 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b1623d1-2084-419e-b36a-80930113a280","Type":"ContainerStarted","Data":"ac69936b9763ba949a97946cdc99dfe2182bce7d03dd02c1f5ec30e63347dd8e"} Mar 18 09:26:27 crc kubenswrapper[4778]: I0318 09:26:27.436731 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b1623d1-2084-419e-b36a-80930113a280","Type":"ContainerStarted","Data":"a5bf04c00c907bf04465611a4b915a3a63e88e8b73433bf7e23c0b7d7f8aebb9"} Mar 18 09:26:27 crc kubenswrapper[4778]: I0318 09:26:27.469005 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.4689710160000002 podStartE2EDuration="2.468971016s" podCreationTimestamp="2026-03-18 09:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:27.453574786 +0000 UTC m=+1454.028319636" watchObservedRunningTime="2026-03-18 09:26:27.468971016 +0000 UTC m=+1454.043715866" Mar 18 09:26:28 crc kubenswrapper[4778]: I0318 09:26:28.068347 4778 scope.go:117] "RemoveContainer" containerID="3c3567d850d5fbfcade4077c9139b7f651174e9261a4f7a1ab2f40e22fce3000" Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.147627 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.148112 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.148181 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.149367 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84af870879787d3b66c88494739078983637b86ff0a5bba3998c98c4487f8853"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.149465 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://84af870879787d3b66c88494739078983637b86ff0a5bba3998c98c4487f8853" gracePeriod=600 Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.473740 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="84af870879787d3b66c88494739078983637b86ff0a5bba3998c98c4487f8853" exitCode=0 Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.473841 4778 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"84af870879787d3b66c88494739078983637b86ff0a5bba3998c98c4487f8853"} Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.474121 4778 scope.go:117] "RemoveContainer" containerID="7154b13c7f4cd2402d0304ed7b86d22cac3e7b544f6222d5ad0d8d8ac0463fe2" Mar 18 09:26:31 crc kubenswrapper[4778]: I0318 09:26:31.111588 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 09:26:31 crc kubenswrapper[4778]: I0318 09:26:31.491173 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492"} Mar 18 09:26:32 crc kubenswrapper[4778]: I0318 09:26:32.078188 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 09:26:32 crc kubenswrapper[4778]: I0318 09:26:32.078555 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 09:26:33 crc kubenswrapper[4778]: I0318 09:26:33.099620 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8a702c51-b7a6-4094-9d34-519102e1cf91" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 09:26:33 crc kubenswrapper[4778]: I0318 09:26:33.099622 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8a702c51-b7a6-4094-9d34-519102e1cf91" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 09:26:34 crc 
kubenswrapper[4778]: I0318 09:26:34.797795 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 09:26:34 crc kubenswrapper[4778]: I0318 09:26:34.798335 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 09:26:35 crc kubenswrapper[4778]: I0318 09:26:35.810507 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="28f01ca6-f7d2-4de3-9aa9-256803533b80" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 09:26:35 crc kubenswrapper[4778]: I0318 09:26:35.810531 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="28f01ca6-f7d2-4de3-9aa9-256803533b80" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 09:26:36 crc kubenswrapper[4778]: I0318 09:26:36.111617 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 09:26:36 crc kubenswrapper[4778]: I0318 09:26:36.137800 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 09:26:36 crc kubenswrapper[4778]: I0318 09:26:36.585192 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 09:26:40 crc kubenswrapper[4778]: I0318 09:26:40.077725 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 09:26:40 crc kubenswrapper[4778]: I0318 09:26:40.078692 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 09:26:42 crc kubenswrapper[4778]: I0318 09:26:42.089769 4778 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 09:26:42 crc kubenswrapper[4778]: I0318 09:26:42.092167 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 09:26:42 crc kubenswrapper[4778]: I0318 09:26:42.100599 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 09:26:42 crc kubenswrapper[4778]: I0318 09:26:42.621774 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 09:26:42 crc kubenswrapper[4778]: I0318 09:26:42.797403 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 09:26:42 crc kubenswrapper[4778]: I0318 09:26:42.797453 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 09:26:44 crc kubenswrapper[4778]: I0318 09:26:44.807359 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 09:26:44 crc kubenswrapper[4778]: I0318 09:26:44.809771 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 09:26:44 crc kubenswrapper[4778]: I0318 09:26:44.819428 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 09:26:45 crc kubenswrapper[4778]: I0318 09:26:45.655123 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 09:26:45 crc kubenswrapper[4778]: I0318 09:26:45.921825 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 09:26:54 crc kubenswrapper[4778]: I0318 09:26:54.409275 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:26:56 crc kubenswrapper[4778]: I0318 09:26:56.204422 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:26:58 crc kubenswrapper[4778]: I0318 09:26:58.500408 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="rabbitmq" containerID="cri-o://29a1b3eb3844657a811126fd222acecd2b3045c5c60889965b169deb2aec2c9a" gracePeriod=604796 Mar 18 09:27:00 crc kubenswrapper[4778]: I0318 09:27:00.078835 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 18 09:27:00 crc kubenswrapper[4778]: I0318 09:27:00.707224 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerName="rabbitmq" containerID="cri-o://4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48" gracePeriod=604796 Mar 18 09:27:04 crc kubenswrapper[4778]: I0318 09:27:04.882729 4778 generic.go:334] "Generic (PLEG): container finished" podID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerID="29a1b3eb3844657a811126fd222acecd2b3045c5c60889965b169deb2aec2c9a" exitCode=0 Mar 18 09:27:04 crc kubenswrapper[4778]: I0318 09:27:04.882840 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57955df9-f0c5-4cfc-91fd-135771be7ed2","Type":"ContainerDied","Data":"29a1b3eb3844657a811126fd222acecd2b3045c5c60889965b169deb2aec2c9a"} Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.148126 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.173703 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-config-data\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.173787 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-confd\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.173824 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-plugins\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.173859 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-plugins-conf\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.173883 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-tls\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.173925 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trc2k\" 
(UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-kube-api-access-trc2k\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.174039 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57955df9-f0c5-4cfc-91fd-135771be7ed2-pod-info\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.174114 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.174144 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-erlang-cookie\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.174177 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57955df9-f0c5-4cfc-91fd-135771be7ed2-erlang-cookie-secret\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.174268 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-server-conf\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 
09:27:05.184576 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.184905 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.185809 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.186449 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.187140 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.193173 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-kube-api-access-trc2k" (OuterVolumeSpecName: "kube-api-access-trc2k") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "kube-api-access-trc2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.198028 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57955df9-f0c5-4cfc-91fd-135771be7ed2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.204943 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/57955df9-f0c5-4cfc-91fd-135771be7ed2-pod-info" (OuterVolumeSpecName: "pod-info") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.227323 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-config-data" (OuterVolumeSpecName: "config-data") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276847 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276880 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276890 4778 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57955df9-f0c5-4cfc-91fd-135771be7ed2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276899 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276908 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276917 4778 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276926 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276935 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trc2k\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-kube-api-access-trc2k\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276944 4778 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57955df9-f0c5-4cfc-91fd-135771be7ed2-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.279993 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-server-conf" (OuterVolumeSpecName: "server-conf") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.300945 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.366783 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.378512 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.378543 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.378553 4778 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.894913 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57955df9-f0c5-4cfc-91fd-135771be7ed2","Type":"ContainerDied","Data":"9908701ec8146f186284a44b30a5e2c02918471c220798a76b04c8de271c7850"} Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.895500 4778 scope.go:117] "RemoveContainer" containerID="29a1b3eb3844657a811126fd222acecd2b3045c5c60889965b169deb2aec2c9a" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.894956 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.942410 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.951782 4778 scope.go:117] "RemoveContainer" containerID="3698e124eba53a15e3f16dfe6346805545443e4b8ce94d12254d508326508979" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.952364 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.987786 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:27:05 crc kubenswrapper[4778]: E0318 09:27:05.988217 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="rabbitmq" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.988240 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="rabbitmq" Mar 18 09:27:05 crc kubenswrapper[4778]: E0318 09:27:05.988287 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="setup-container" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.988297 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="setup-container" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.988511 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="rabbitmq" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.989751 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.994289 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.994761 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.996144 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.996365 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.996144 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.998427 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-b2npt" Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.998630 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.011767 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191471 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191519 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191583 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0191a745-1fe2-4a1c-b007-96525ad39787-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191609 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191648 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191676 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191697 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/0191a745-1fe2-4a1c-b007-96525ad39787-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191714 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-config-data\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191742 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.192043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.192223 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v4n2\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-kube-api-access-9v4n2\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.201249 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" path="/var/lib/kubelet/pods/57955df9-f0c5-4cfc-91fd-135771be7ed2/volumes" Mar 18 
09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294549 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294607 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0191a745-1fe2-4a1c-b007-96525ad39787-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294645 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294674 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294711 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294736 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0191a745-1fe2-4a1c-b007-96525ad39787-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294759 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-config-data\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294795 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294922 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v4n2\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-kube-api-access-9v4n2\") pod \"rabbitmq-server-0\" (UID: 
\"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.295698 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.295945 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.296489 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-config-data\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.296690 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.296864 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.297588 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.300268 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0191a745-1fe2-4a1c-b007-96525ad39787-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.304233 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.304527 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.306974 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0191a745-1fe2-4a1c-b007-96525ad39787-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.322216 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v4n2\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-kube-api-access-9v4n2\") pod 
\"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.332258 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.358644 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.883714 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.258074 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.417781 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/671fe1be-f3dd-475e-8c48-a1d1db510aef-erlang-cookie-secret\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.417899 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.417946 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/671fe1be-f3dd-475e-8c48-a1d1db510aef-pod-info\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 
crc kubenswrapper[4778]: I0318 09:27:07.418008 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-erlang-cookie\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418028 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-plugins\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418049 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-config-data\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418071 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-tls\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418106 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-confd\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418146 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhk6k\" (UniqueName: 
\"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-kube-api-access-mhk6k\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418175 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-plugins-conf\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418191 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-server-conf\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418909 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.420176 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.420691 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.424168 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.425781 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671fe1be-f3dd-475e-8c48-a1d1db510aef-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.425796 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.427331 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-kube-api-access-mhk6k" (OuterVolumeSpecName: "kube-api-access-mhk6k") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "kube-api-access-mhk6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.429439 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/671fe1be-f3dd-475e-8c48-a1d1db510aef-pod-info" (OuterVolumeSpecName: "pod-info") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.463849 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-config-data" (OuterVolumeSpecName: "config-data") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.496909 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-server-conf" (OuterVolumeSpecName: "server-conf") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520451 4778 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/671fe1be-f3dd-475e-8c48-a1d1db510aef-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520504 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520515 4778 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/671fe1be-f3dd-475e-8c48-a1d1db510aef-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520538 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520549 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520560 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520568 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520576 4778 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhk6k\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-kube-api-access-mhk6k\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520587 4778 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520596 4778 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.540680 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.554779 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.622565 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.622949 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.913214 4778 generic.go:334] "Generic (PLEG): container finished" podID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerID="4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48" exitCode=0 Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.913790 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"671fe1be-f3dd-475e-8c48-a1d1db510aef","Type":"ContainerDied","Data":"4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48"} Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.913887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"671fe1be-f3dd-475e-8c48-a1d1db510aef","Type":"ContainerDied","Data":"fd11d5ce698fa7411014a038329cc43022813b8f74df72dacf0e69cfc2473b23"} Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.913962 4778 scope.go:117] "RemoveContainer" containerID="4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.914132 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.933796 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0191a745-1fe2-4a1c-b007-96525ad39787","Type":"ContainerStarted","Data":"75903085f19cb6ef05c70fe9495d00471dd6c2b8a106f236b36f593571d1a81d"} Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.071586 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.083036 4778 scope.go:117] "RemoveContainer" containerID="fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.090861 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:27:08 crc kubenswrapper[4778]: E0318 09:27:08.107830 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671fe1be_f3dd_475e_8c48_a1d1db510aef.slice/crio-fd11d5ce698fa7411014a038329cc43022813b8f74df72dacf0e69cfc2473b23\": RecentStats: unable to find data in memory cache]" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.109845 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:27:08 crc kubenswrapper[4778]: E0318 09:27:08.110376 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerName="rabbitmq" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.110397 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerName="rabbitmq" Mar 18 09:27:08 crc kubenswrapper[4778]: E0318 09:27:08.110415 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" 
containerName="setup-container" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.110424 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerName="setup-container" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.110672 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerName="rabbitmq" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.111863 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.119944 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.120108 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.120451 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.120604 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.120763 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.120917 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.122221 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7f9jg" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.128990 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:27:08 crc 
kubenswrapper[4778]: I0318 09:27:08.197134 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" path="/var/lib/kubelet/pods/671fe1be-f3dd-475e-8c48-a1d1db510aef/volumes" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.204980 4778 scope.go:117] "RemoveContainer" containerID="4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48" Mar 18 09:27:08 crc kubenswrapper[4778]: E0318 09:27:08.205754 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48\": container with ID starting with 4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48 not found: ID does not exist" containerID="4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.205785 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48"} err="failed to get container status \"4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48\": rpc error: code = NotFound desc = could not find container \"4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48\": container with ID starting with 4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48 not found: ID does not exist" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.205805 4778 scope.go:117] "RemoveContainer" containerID="fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7" Mar 18 09:27:08 crc kubenswrapper[4778]: E0318 09:27:08.206085 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7\": container with ID starting with 
fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7 not found: ID does not exist" containerID="fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.206104 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7"} err="failed to get container status \"fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7\": rpc error: code = NotFound desc = could not find container \"fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7\": container with ID starting with fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7 not found: ID does not exist" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.237499 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.237563 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.237594 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.237732 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f9428cb3-4fdf-4b01-9368-28b413ecf82f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.237964 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8f74\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-kube-api-access-n8f74\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.238079 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.238112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.238136 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.238213 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.238249 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f9428cb3-4fdf-4b01-9368-28b413ecf82f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.238274 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340192 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f9428cb3-4fdf-4b01-9368-28b413ecf82f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340434 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8f74\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-kube-api-access-n8f74\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340514 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340544 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340614 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340682 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340760 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f9428cb3-4fdf-4b01-9368-28b413ecf82f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340792 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340892 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340955 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.341038 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.341527 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.341632 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.341766 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.342094 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.345161 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.346675 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f9428cb3-4fdf-4b01-9368-28b413ecf82f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: 
I0318 09:27:08.362360 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8f74\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-kube-api-access-n8f74\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.392698 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.392724 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f9428cb3-4fdf-4b01-9368-28b413ecf82f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.410104 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.438974 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.479269 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.944661 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0191a745-1fe2-4a1c-b007-96525ad39787","Type":"ContainerStarted","Data":"0396d45944ee78a263eb77715d857aabaf7376175957a0a8bd08d08293d88300"} Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.962481 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:27:08 crc kubenswrapper[4778]: W0318 09:27:08.972467 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9428cb3_4fdf_4b01_9368_28b413ecf82f.slice/crio-b3c9d4d88b999e63ea6962a213c1a22ebbce9fbc98a3be4a390c792ef9b67b0f WatchSource:0}: Error finding container b3c9d4d88b999e63ea6962a213c1a22ebbce9fbc98a3be4a390c792ef9b67b0f: Status 404 returned error can't find the container with id b3c9d4d88b999e63ea6962a213c1a22ebbce9fbc98a3be4a390c792ef9b67b0f Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.075759 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-vmtdf"] Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.077591 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.079360 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.100937 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-vmtdf"] Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.158104 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-dns-svc\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.158151 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.158173 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbl7b\" (UniqueName: \"kubernetes.io/projected/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-kube-api-access-dbl7b\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.158279 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " 
pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.158316 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.158462 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-config\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.260304 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-dns-svc\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.260369 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.260415 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbl7b\" (UniqueName: \"kubernetes.io/projected/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-kube-api-access-dbl7b\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " 
pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.260537 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.261576 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-dns-svc\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.261618 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.261722 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.261752 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc 
kubenswrapper[4778]: I0318 09:27:09.262279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.262673 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-config\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.263549 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-config\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.289898 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbl7b\" (UniqueName: \"kubernetes.io/projected/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-kube-api-access-dbl7b\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.537958 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.957932 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f9428cb3-4fdf-4b01-9368-28b413ecf82f","Type":"ContainerStarted","Data":"b3c9d4d88b999e63ea6962a213c1a22ebbce9fbc98a3be4a390c792ef9b67b0f"} Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.996466 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-vmtdf"] Mar 18 09:27:10 crc kubenswrapper[4778]: W0318 09:27:10.099881 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod974eb9b0_6a47_42f5_a01d_4abc2c7ea9db.slice/crio-73dd95d61a24a7d79ef30ee5ee880b92ec53e03c716ada9313ef6fb1715ea360 WatchSource:0}: Error finding container 73dd95d61a24a7d79ef30ee5ee880b92ec53e03c716ada9313ef6fb1715ea360: Status 404 returned error can't find the container with id 73dd95d61a24a7d79ef30ee5ee880b92ec53e03c716ada9313ef6fb1715ea360 Mar 18 09:27:10 crc kubenswrapper[4778]: I0318 09:27:10.974979 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f9428cb3-4fdf-4b01-9368-28b413ecf82f","Type":"ContainerStarted","Data":"8500c305081e16a099a488523dbfb64e89d7b5eed8698351067cc8182c26be12"} Mar 18 09:27:10 crc kubenswrapper[4778]: I0318 09:27:10.977170 4778 generic.go:334] "Generic (PLEG): container finished" podID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerID="bb6fddc74fb77836587c6c361f18d4c91b25b5b0b236c261f677eca1ad0a9af5" exitCode=0 Mar 18 09:27:10 crc kubenswrapper[4778]: I0318 09:27:10.977244 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" event={"ID":"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db","Type":"ContainerDied","Data":"bb6fddc74fb77836587c6c361f18d4c91b25b5b0b236c261f677eca1ad0a9af5"} Mar 18 09:27:10 crc 
kubenswrapper[4778]: I0318 09:27:10.977283 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" event={"ID":"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db","Type":"ContainerStarted","Data":"73dd95d61a24a7d79ef30ee5ee880b92ec53e03c716ada9313ef6fb1715ea360"} Mar 18 09:27:11 crc kubenswrapper[4778]: I0318 09:27:11.993480 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" event={"ID":"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db","Type":"ContainerStarted","Data":"0abbb14494f4856a4a551565083f0ffeae0b96885bc4209c98f8326f13f1e0ce"} Mar 18 09:27:12 crc kubenswrapper[4778]: I0318 09:27:12.046624 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" podStartSLOduration=3.046585733 podStartE2EDuration="3.046585733s" podCreationTimestamp="2026-03-18 09:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:27:12.019066753 +0000 UTC m=+1498.593811663" watchObservedRunningTime="2026-03-18 09:27:12.046585733 +0000 UTC m=+1498.621330613" Mar 18 09:27:13 crc kubenswrapper[4778]: I0318 09:27:13.003981 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.453650 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qb7km"] Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.456653 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.467503 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb7km"] Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.559115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-utilities\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.559648 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-catalog-content\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.559697 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m898n\" (UniqueName: \"kubernetes.io/projected/c9acf58d-8699-44d0-8478-bec0c033dbd1-kube-api-access-m898n\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.661624 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-utilities\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.662126 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-catalog-content\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.662271 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m898n\" (UniqueName: \"kubernetes.io/projected/c9acf58d-8699-44d0-8478-bec0c033dbd1-kube-api-access-m898n\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.662360 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-utilities\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.663018 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-catalog-content\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.686929 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m898n\" (UniqueName: \"kubernetes.io/projected/c9acf58d-8699-44d0-8478-bec0c033dbd1-kube-api-access-m898n\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.796655 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:17 crc kubenswrapper[4778]: I0318 09:27:17.289770 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb7km"] Mar 18 09:27:17 crc kubenswrapper[4778]: W0318 09:27:17.296138 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9acf58d_8699_44d0_8478_bec0c033dbd1.slice/crio-92d63e3e33616fc4f77cfb3a4148d6d69c68dfad28fb31cbacf24a70ff5a7286 WatchSource:0}: Error finding container 92d63e3e33616fc4f77cfb3a4148d6d69c68dfad28fb31cbacf24a70ff5a7286: Status 404 returned error can't find the container with id 92d63e3e33616fc4f77cfb3a4148d6d69c68dfad28fb31cbacf24a70ff5a7286 Mar 18 09:27:18 crc kubenswrapper[4778]: I0318 09:27:18.080352 4778 generic.go:334] "Generic (PLEG): container finished" podID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerID="335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55" exitCode=0 Mar 18 09:27:18 crc kubenswrapper[4778]: I0318 09:27:18.080475 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerDied","Data":"335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55"} Mar 18 09:27:18 crc kubenswrapper[4778]: I0318 09:27:18.080965 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerStarted","Data":"92d63e3e33616fc4f77cfb3a4148d6d69c68dfad28fb31cbacf24a70ff5a7286"} Mar 18 09:27:18 crc kubenswrapper[4778]: I0318 09:27:18.083897 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.090225 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerStarted","Data":"34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1"} Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.540380 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.626885 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-hzxbf"] Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.627182 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerName="dnsmasq-dns" containerID="cri-o://76b7c61aa2a7be4fd8a7e4d8e9e84698c3d85d0b2b4f2c826eef7fac6ce91214" gracePeriod=10 Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.774556 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-6qq8b"] Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.776781 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.783042 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-6qq8b"] Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.836569 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2gn6\" (UniqueName: \"kubernetes.io/projected/5778826c-0b71-4dad-af9c-c7ec7f04aa36-kube-api-access-q2gn6\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.836663 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.836727 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.836772 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.836786 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.836803 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-config\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.939668 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.939723 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.939753 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-config\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.939915 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-q2gn6\" (UniqueName: \"kubernetes.io/projected/5778826c-0b71-4dad-af9c-c7ec7f04aa36-kube-api-access-q2gn6\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.939999 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.940079 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.940781 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-config\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.940852 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.941041 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.941497 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.942039 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.968839 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2gn6\" (UniqueName: \"kubernetes.io/projected/5778826c-0b71-4dad-af9c-c7ec7f04aa36-kube-api-access-q2gn6\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.116611 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.168366 4778 generic.go:334] "Generic (PLEG): container finished" podID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerID="34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1" exitCode=0 Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.168456 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerDied","Data":"34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1"} Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.188588 4778 generic.go:334] "Generic (PLEG): container finished" podID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerID="76b7c61aa2a7be4fd8a7e4d8e9e84698c3d85d0b2b4f2c826eef7fac6ce91214" exitCode=0 Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.188639 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" event={"ID":"31b962b8-c7e1-495c-a52d-f4fb63e884ca","Type":"ContainerDied","Data":"76b7c61aa2a7be4fd8a7e4d8e9e84698c3d85d0b2b4f2c826eef7fac6ce91214"} Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.290982 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.354610 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-dns-svc\") pod \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.355171 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts5rq\" (UniqueName: \"kubernetes.io/projected/31b962b8-c7e1-495c-a52d-f4fb63e884ca-kube-api-access-ts5rq\") pod \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.355323 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-nb\") pod \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.355467 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-sb\") pod \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.355546 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-config\") pod \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.363142 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/31b962b8-c7e1-495c-a52d-f4fb63e884ca-kube-api-access-ts5rq" (OuterVolumeSpecName: "kube-api-access-ts5rq") pod "31b962b8-c7e1-495c-a52d-f4fb63e884ca" (UID: "31b962b8-c7e1-495c-a52d-f4fb63e884ca"). InnerVolumeSpecName "kube-api-access-ts5rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.422905 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "31b962b8-c7e1-495c-a52d-f4fb63e884ca" (UID: "31b962b8-c7e1-495c-a52d-f4fb63e884ca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.425455 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "31b962b8-c7e1-495c-a52d-f4fb63e884ca" (UID: "31b962b8-c7e1-495c-a52d-f4fb63e884ca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.437702 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-config" (OuterVolumeSpecName: "config") pod "31b962b8-c7e1-495c-a52d-f4fb63e884ca" (UID: "31b962b8-c7e1-495c-a52d-f4fb63e884ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.443821 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31b962b8-c7e1-495c-a52d-f4fb63e884ca" (UID: "31b962b8-c7e1-495c-a52d-f4fb63e884ca"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.458030 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.458062 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.458074 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.458085 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.458096 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts5rq\" (UniqueName: \"kubernetes.io/projected/31b962b8-c7e1-495c-a52d-f4fb63e884ca-kube-api-access-ts5rq\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.743543 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-6qq8b"] Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.202763 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" event={"ID":"31b962b8-c7e1-495c-a52d-f4fb63e884ca","Type":"ContainerDied","Data":"f1b064a501bbcb63fd4dc86edbad3f121ced055deab0d7f76a0a08acbb649b7a"} Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.202811 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.203327 4778 scope.go:117] "RemoveContainer" containerID="76b7c61aa2a7be4fd8a7e4d8e9e84698c3d85d0b2b4f2c826eef7fac6ce91214" Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.205060 4778 generic.go:334] "Generic (PLEG): container finished" podID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerID="959408f0646af1976d94cc538a0b42d120c9b802bebf63b681de404e7a6632a0" exitCode=0 Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.205123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" event={"ID":"5778826c-0b71-4dad-af9c-c7ec7f04aa36","Type":"ContainerDied","Data":"959408f0646af1976d94cc538a0b42d120c9b802bebf63b681de404e7a6632a0"} Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.206297 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" event={"ID":"5778826c-0b71-4dad-af9c-c7ec7f04aa36","Type":"ContainerStarted","Data":"c164a0a62f55fc2358e88475eedc2afcbd5a8934bf23dc30fb39c81fc6c685f2"} Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.209480 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerStarted","Data":"3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990"} Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.232951 4778 scope.go:117] "RemoveContainer" containerID="bda48f5cd3722d61bee57d8aacf12b0b775a391dd2ad3e1c8999cc5bb624e15c" Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.246042 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qb7km" podStartSLOduration=2.692884403 podStartE2EDuration="5.246025759s" podCreationTimestamp="2026-03-18 09:27:16 +0000 UTC" firstStartedPulling="2026-03-18 09:27:18.083537555 +0000 
UTC m=+1504.658282395" lastFinishedPulling="2026-03-18 09:27:20.636678911 +0000 UTC m=+1507.211423751" observedRunningTime="2026-03-18 09:27:21.236228172 +0000 UTC m=+1507.810973012" watchObservedRunningTime="2026-03-18 09:27:21.246025759 +0000 UTC m=+1507.820770599" Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.443561 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-hzxbf"] Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.455501 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-hzxbf"] Mar 18 09:27:22 crc kubenswrapper[4778]: I0318 09:27:22.200056 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" path="/var/lib/kubelet/pods/31b962b8-c7e1-495c-a52d-f4fb63e884ca/volumes" Mar 18 09:27:22 crc kubenswrapper[4778]: I0318 09:27:22.223534 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" event={"ID":"5778826c-0b71-4dad-af9c-c7ec7f04aa36","Type":"ContainerStarted","Data":"fd53655c1355ee9e239be226351d5c5537c6593f18ffcbefe53a7ddaf1ea2816"} Mar 18 09:27:22 crc kubenswrapper[4778]: I0318 09:27:22.225337 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:22 crc kubenswrapper[4778]: I0318 09:27:22.280990 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" podStartSLOduration=3.28096625 podStartE2EDuration="3.28096625s" podCreationTimestamp="2026-03-18 09:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:27:22.24573298 +0000 UTC m=+1508.820477820" watchObservedRunningTime="2026-03-18 09:27:22.28096625 +0000 UTC m=+1508.855711080" Mar 18 09:27:26 crc kubenswrapper[4778]: I0318 09:27:26.797149 4778 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:26 crc kubenswrapper[4778]: I0318 09:27:26.797737 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:26 crc kubenswrapper[4778]: I0318 09:27:26.857477 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:27 crc kubenswrapper[4778]: I0318 09:27:27.362112 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:27 crc kubenswrapper[4778]: I0318 09:27:27.426883 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb7km"] Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.299796 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qb7km" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="registry-server" containerID="cri-o://3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990" gracePeriod=2 Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.841391 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.955267 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-utilities\") pod \"c9acf58d-8699-44d0-8478-bec0c033dbd1\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.955386 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m898n\" (UniqueName: \"kubernetes.io/projected/c9acf58d-8699-44d0-8478-bec0c033dbd1-kube-api-access-m898n\") pod \"c9acf58d-8699-44d0-8478-bec0c033dbd1\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.955476 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-catalog-content\") pod \"c9acf58d-8699-44d0-8478-bec0c033dbd1\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.956259 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-utilities" (OuterVolumeSpecName: "utilities") pod "c9acf58d-8699-44d0-8478-bec0c033dbd1" (UID: "c9acf58d-8699-44d0-8478-bec0c033dbd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.959811 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9acf58d-8699-44d0-8478-bec0c033dbd1-kube-api-access-m898n" (OuterVolumeSpecName: "kube-api-access-m898n") pod "c9acf58d-8699-44d0-8478-bec0c033dbd1" (UID: "c9acf58d-8699-44d0-8478-bec0c033dbd1"). InnerVolumeSpecName "kube-api-access-m898n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.992185 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9acf58d-8699-44d0-8478-bec0c033dbd1" (UID: "c9acf58d-8699-44d0-8478-bec0c033dbd1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.057754 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.057807 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m898n\" (UniqueName: \"kubernetes.io/projected/c9acf58d-8699-44d0-8478-bec0c033dbd1-kube-api-access-m898n\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.057829 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.119942 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.185003 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-vmtdf"] Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.185282 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerName="dnsmasq-dns" containerID="cri-o://0abbb14494f4856a4a551565083f0ffeae0b96885bc4209c98f8326f13f1e0ce" 
gracePeriod=10 Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.330795 4778 generic.go:334] "Generic (PLEG): container finished" podID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerID="3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990" exitCode=0 Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.330922 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerDied","Data":"3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990"} Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.330953 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerDied","Data":"92d63e3e33616fc4f77cfb3a4148d6d69c68dfad28fb31cbacf24a70ff5a7286"} Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.330971 4778 scope.go:117] "RemoveContainer" containerID="3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.331136 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.336895 4778 generic.go:334] "Generic (PLEG): container finished" podID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerID="0abbb14494f4856a4a551565083f0ffeae0b96885bc4209c98f8326f13f1e0ce" exitCode=0 Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.336957 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" event={"ID":"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db","Type":"ContainerDied","Data":"0abbb14494f4856a4a551565083f0ffeae0b96885bc4209c98f8326f13f1e0ce"} Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.375492 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb7km"] Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.378494 4778 scope.go:117] "RemoveContainer" containerID="34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.397183 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb7km"] Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.422626 4778 scope.go:117] "RemoveContainer" containerID="335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.476478 4778 scope.go:117] "RemoveContainer" containerID="3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990" Mar 18 09:27:30 crc kubenswrapper[4778]: E0318 09:27:30.476976 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990\": container with ID starting with 3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990 not found: ID does not exist" containerID="3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990" Mar 18 
09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.477005 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990"} err="failed to get container status \"3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990\": rpc error: code = NotFound desc = could not find container \"3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990\": container with ID starting with 3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990 not found: ID does not exist" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.477025 4778 scope.go:117] "RemoveContainer" containerID="34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1" Mar 18 09:27:30 crc kubenswrapper[4778]: E0318 09:27:30.477272 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1\": container with ID starting with 34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1 not found: ID does not exist" containerID="34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.477289 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1"} err="failed to get container status \"34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1\": rpc error: code = NotFound desc = could not find container \"34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1\": container with ID starting with 34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1 not found: ID does not exist" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.477301 4778 scope.go:117] "RemoveContainer" 
containerID="335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55" Mar 18 09:27:30 crc kubenswrapper[4778]: E0318 09:27:30.477542 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55\": container with ID starting with 335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55 not found: ID does not exist" containerID="335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.477563 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55"} err="failed to get container status \"335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55\": rpc error: code = NotFound desc = could not find container \"335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55\": container with ID starting with 335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55 not found: ID does not exist" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.730563 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.773841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-dns-svc\") pod \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.773891 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbl7b\" (UniqueName: \"kubernetes.io/projected/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-kube-api-access-dbl7b\") pod \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.774044 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-nb\") pod \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.774085 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-sb\") pod \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.774125 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-config\") pod \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.774144 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-openstack-edpm-ipam\") pod \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.791469 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-kube-api-access-dbl7b" (OuterVolumeSpecName: "kube-api-access-dbl7b") pod "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" (UID: "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db"). InnerVolumeSpecName "kube-api-access-dbl7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.826685 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" (UID: "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.835289 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" (UID: "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.836007 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" (UID: "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.853758 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" (UID: "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.860322 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-config" (OuterVolumeSpecName: "config") pod "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" (UID: "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.876020 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.876057 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.876071 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.876082 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 
crc kubenswrapper[4778]: I0318 09:27:30.876093 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.876103 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbl7b\" (UniqueName: \"kubernetes.io/projected/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-kube-api-access-dbl7b\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:31 crc kubenswrapper[4778]: I0318 09:27:31.348284 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" event={"ID":"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db","Type":"ContainerDied","Data":"73dd95d61a24a7d79ef30ee5ee880b92ec53e03c716ada9313ef6fb1715ea360"} Mar 18 09:27:31 crc kubenswrapper[4778]: I0318 09:27:31.348350 4778 scope.go:117] "RemoveContainer" containerID="0abbb14494f4856a4a551565083f0ffeae0b96885bc4209c98f8326f13f1e0ce" Mar 18 09:27:31 crc kubenswrapper[4778]: I0318 09:27:31.348386 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:31 crc kubenswrapper[4778]: I0318 09:27:31.369733 4778 scope.go:117] "RemoveContainer" containerID="bb6fddc74fb77836587c6c361f18d4c91b25b5b0b236c261f677eca1ad0a9af5" Mar 18 09:27:31 crc kubenswrapper[4778]: I0318 09:27:31.395178 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-vmtdf"] Mar 18 09:27:31 crc kubenswrapper[4778]: I0318 09:27:31.402777 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-vmtdf"] Mar 18 09:27:32 crc kubenswrapper[4778]: I0318 09:27:32.196905 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" path="/var/lib/kubelet/pods/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db/volumes" Mar 18 09:27:32 crc kubenswrapper[4778]: I0318 09:27:32.197776 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" path="/var/lib/kubelet/pods/c9acf58d-8699-44d0-8478-bec0c033dbd1/volumes" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.197694 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p"] Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198672 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="extract-utilities" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198686 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="extract-utilities" Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198699 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="extract-content" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198706 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="extract-content" Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198722 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerName="init" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198728 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerName="init" Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198747 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="registry-server" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198752 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="registry-server" Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198766 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerName="init" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198772 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerName="init" Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198782 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerName="dnsmasq-dns" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198788 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerName="dnsmasq-dns" Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198803 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerName="dnsmasq-dns" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198809 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" 
containerName="dnsmasq-dns" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198977 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="registry-server" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.199007 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerName="dnsmasq-dns" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.199018 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerName="dnsmasq-dns" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.199681 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.201766 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.203054 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.203455 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.204442 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.210519 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p"] Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.304815 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqc9n\" (UniqueName: 
\"kubernetes.io/projected/154a89df-1c2e-4f86-bbf3-827d6443c04a-kube-api-access-zqc9n\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.304889 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.305303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.305581 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.407654 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-ssh-key-openstack-edpm-ipam\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.407744 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.407844 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqc9n\" (UniqueName: \"kubernetes.io/projected/154a89df-1c2e-4f86-bbf3-827d6443c04a-kube-api-access-zqc9n\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.407889 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.416867 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc 
kubenswrapper[4778]: I0318 09:27:40.416886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.417942 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.432607 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqc9n\" (UniqueName: \"kubernetes.io/projected/154a89df-1c2e-4f86-bbf3-827d6443c04a-kube-api-access-zqc9n\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.538116 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:41 crc kubenswrapper[4778]: I0318 09:27:41.136745 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p"] Mar 18 09:27:41 crc kubenswrapper[4778]: I0318 09:27:41.465547 4778 generic.go:334] "Generic (PLEG): container finished" podID="0191a745-1fe2-4a1c-b007-96525ad39787" containerID="0396d45944ee78a263eb77715d857aabaf7376175957a0a8bd08d08293d88300" exitCode=0 Mar 18 09:27:41 crc kubenswrapper[4778]: I0318 09:27:41.465624 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0191a745-1fe2-4a1c-b007-96525ad39787","Type":"ContainerDied","Data":"0396d45944ee78a263eb77715d857aabaf7376175957a0a8bd08d08293d88300"} Mar 18 09:27:41 crc kubenswrapper[4778]: I0318 09:27:41.467737 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" event={"ID":"154a89df-1c2e-4f86-bbf3-827d6443c04a","Type":"ContainerStarted","Data":"a6279dd667e196403d3c5c22ceabeb3e0818bd4665ef38b4085c12358feff855"} Mar 18 09:27:43 crc kubenswrapper[4778]: I0318 09:27:42.479366 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0191a745-1fe2-4a1c-b007-96525ad39787","Type":"ContainerStarted","Data":"f3b54915bc8b60b3c34537d45c4915fceb3c19874748837f3a8ee823c2999484"} Mar 18 09:27:43 crc kubenswrapper[4778]: I0318 09:27:42.481109 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 09:27:43 crc kubenswrapper[4778]: I0318 09:27:42.526479 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.526457451 podStartE2EDuration="37.526457451s" podCreationTimestamp="2026-03-18 09:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:27:42.51690295 +0000 UTC m=+1529.091647810" watchObservedRunningTime="2026-03-18 09:27:42.526457451 +0000 UTC m=+1529.101202291" Mar 18 09:27:44 crc kubenswrapper[4778]: I0318 09:27:44.501022 4778 generic.go:334] "Generic (PLEG): container finished" podID="f9428cb3-4fdf-4b01-9368-28b413ecf82f" containerID="8500c305081e16a099a488523dbfb64e89d7b5eed8698351067cc8182c26be12" exitCode=0 Mar 18 09:27:44 crc kubenswrapper[4778]: I0318 09:27:44.501090 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f9428cb3-4fdf-4b01-9368-28b413ecf82f","Type":"ContainerDied","Data":"8500c305081e16a099a488523dbfb64e89d7b5eed8698351067cc8182c26be12"} Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.591128 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f9428cb3-4fdf-4b01-9368-28b413ecf82f","Type":"ContainerStarted","Data":"326d191845db7deb2003a53c8c81cda883199fe6e269581c5a42595124899cb3"} Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.594218 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.595806 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" event={"ID":"154a89df-1c2e-4f86-bbf3-827d6443c04a","Type":"ContainerStarted","Data":"a0e4df9a818fec5131c555c760ba72656483292d6411393207bbb36928547cc0"} Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.630346 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.630330972 podStartE2EDuration="43.630330972s" podCreationTimestamp="2026-03-18 09:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-18 09:27:51.628709569 +0000 UTC m=+1538.203454419" watchObservedRunningTime="2026-03-18 09:27:51.630330972 +0000 UTC m=+1538.205075812" Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.670832 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" podStartSLOduration=1.858760423 podStartE2EDuration="11.670807335s" podCreationTimestamp="2026-03-18 09:27:40 +0000 UTC" firstStartedPulling="2026-03-18 09:27:41.160726069 +0000 UTC m=+1527.735470919" lastFinishedPulling="2026-03-18 09:27:50.972772991 +0000 UTC m=+1537.547517831" observedRunningTime="2026-03-18 09:27:51.662719295 +0000 UTC m=+1538.237464145" watchObservedRunningTime="2026-03-18 09:27:51.670807335 +0000 UTC m=+1538.245552175" Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.875230 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6rqfc"] Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.877586 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.893977 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6rqfc"] Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.044805 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c267m\" (UniqueName: \"kubernetes.io/projected/60a043ee-4047-4c6e-9c4d-eaa8272648f6-kube-api-access-c267m\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.044861 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-utilities\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.044888 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-catalog-content\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.146277 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c267m\" (UniqueName: \"kubernetes.io/projected/60a043ee-4047-4c6e-9c4d-eaa8272648f6-kube-api-access-c267m\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.146342 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-utilities\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.146375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-catalog-content\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.147001 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-catalog-content\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.147551 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-utilities\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.167075 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c267m\" (UniqueName: \"kubernetes.io/projected/60a043ee-4047-4c6e-9c4d-eaa8272648f6-kube-api-access-c267m\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.240331 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.772220 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6rqfc"] Mar 18 09:27:53 crc kubenswrapper[4778]: I0318 09:27:53.615072 4778 generic.go:334] "Generic (PLEG): container finished" podID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerID="368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62" exitCode=0 Mar 18 09:27:53 crc kubenswrapper[4778]: I0318 09:27:53.615422 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rqfc" event={"ID":"60a043ee-4047-4c6e-9c4d-eaa8272648f6","Type":"ContainerDied","Data":"368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62"} Mar 18 09:27:53 crc kubenswrapper[4778]: I0318 09:27:53.615461 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rqfc" event={"ID":"60a043ee-4047-4c6e-9c4d-eaa8272648f6","Type":"ContainerStarted","Data":"90e09b0c12b6393363e62abca4fa79678465054f2984b57ee714525b9a8b228d"} Mar 18 09:27:55 crc kubenswrapper[4778]: I0318 09:27:55.650260 4778 generic.go:334] "Generic (PLEG): container finished" podID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerID="04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6" exitCode=0 Mar 18 09:27:55 crc kubenswrapper[4778]: I0318 09:27:55.650571 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rqfc" event={"ID":"60a043ee-4047-4c6e-9c4d-eaa8272648f6","Type":"ContainerDied","Data":"04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6"} Mar 18 09:27:56 crc kubenswrapper[4778]: I0318 09:27:56.363439 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 09:27:56 crc kubenswrapper[4778]: I0318 09:27:56.660752 4778 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-6rqfc" event={"ID":"60a043ee-4047-4c6e-9c4d-eaa8272648f6","Type":"ContainerStarted","Data":"10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e"} Mar 18 09:27:56 crc kubenswrapper[4778]: I0318 09:27:56.708524 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6rqfc" podStartSLOduration=3.212776716 podStartE2EDuration="5.708504548s" podCreationTimestamp="2026-03-18 09:27:51 +0000 UTC" firstStartedPulling="2026-03-18 09:27:53.617263365 +0000 UTC m=+1540.192008215" lastFinishedPulling="2026-03-18 09:27:56.112991207 +0000 UTC m=+1542.687736047" observedRunningTime="2026-03-18 09:27:56.706075402 +0000 UTC m=+1543.280820242" watchObservedRunningTime="2026-03-18 09:27:56.708504548 +0000 UTC m=+1543.283249388" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.157880 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563768-4z27c"] Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.162896 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.164775 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.164999 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.166991 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.170035 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563768-4z27c"] Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.296100 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht6bc\" (UniqueName: \"kubernetes.io/projected/ec16e337-91fc-40c5-b3d4-87b5243e5a73-kube-api-access-ht6bc\") pod \"auto-csr-approver-29563768-4z27c\" (UID: \"ec16e337-91fc-40c5-b3d4-87b5243e5a73\") " pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.398824 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht6bc\" (UniqueName: \"kubernetes.io/projected/ec16e337-91fc-40c5-b3d4-87b5243e5a73-kube-api-access-ht6bc\") pod \"auto-csr-approver-29563768-4z27c\" (UID: \"ec16e337-91fc-40c5-b3d4-87b5243e5a73\") " pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.420856 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht6bc\" (UniqueName: \"kubernetes.io/projected/ec16e337-91fc-40c5-b3d4-87b5243e5a73-kube-api-access-ht6bc\") pod \"auto-csr-approver-29563768-4z27c\" (UID: \"ec16e337-91fc-40c5-b3d4-87b5243e5a73\") " 
pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.484086 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:00 crc kubenswrapper[4778]: W0318 09:28:00.975250 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec16e337_91fc_40c5_b3d4_87b5243e5a73.slice/crio-11eb68ad70b16e075622c19b845c0f9b2eb54e4b6f5459e3950e9528763c3016 WatchSource:0}: Error finding container 11eb68ad70b16e075622c19b845c0f9b2eb54e4b6f5459e3950e9528763c3016: Status 404 returned error can't find the container with id 11eb68ad70b16e075622c19b845c0f9b2eb54e4b6f5459e3950e9528763c3016 Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.982014 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563768-4z27c"] Mar 18 09:28:01 crc kubenswrapper[4778]: I0318 09:28:01.711495 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563768-4z27c" event={"ID":"ec16e337-91fc-40c5-b3d4-87b5243e5a73","Type":"ContainerStarted","Data":"11eb68ad70b16e075622c19b845c0f9b2eb54e4b6f5459e3950e9528763c3016"} Mar 18 09:28:02 crc kubenswrapper[4778]: I0318 09:28:02.241430 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:28:02 crc kubenswrapper[4778]: I0318 09:28:02.241842 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:28:02 crc kubenswrapper[4778]: I0318 09:28:02.721350 4778 generic.go:334] "Generic (PLEG): container finished" podID="154a89df-1c2e-4f86-bbf3-827d6443c04a" containerID="a0e4df9a818fec5131c555c760ba72656483292d6411393207bbb36928547cc0" exitCode=0 Mar 18 09:28:02 crc kubenswrapper[4778]: I0318 09:28:02.721398 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" event={"ID":"154a89df-1c2e-4f86-bbf3-827d6443c04a","Type":"ContainerDied","Data":"a0e4df9a818fec5131c555c760ba72656483292d6411393207bbb36928547cc0"} Mar 18 09:28:02 crc kubenswrapper[4778]: I0318 09:28:02.723005 4778 generic.go:334] "Generic (PLEG): container finished" podID="ec16e337-91fc-40c5-b3d4-87b5243e5a73" containerID="201dd8b3293289bdbf9f29c3749f98499b07694d8d80e9df99ed62c3075ec93f" exitCode=0 Mar 18 09:28:02 crc kubenswrapper[4778]: I0318 09:28:02.723035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563768-4z27c" event={"ID":"ec16e337-91fc-40c5-b3d4-87b5243e5a73","Type":"ContainerDied","Data":"201dd8b3293289bdbf9f29c3749f98499b07694d8d80e9df99ed62c3075ec93f"} Mar 18 09:28:03 crc kubenswrapper[4778]: I0318 09:28:03.300192 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6rqfc" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="registry-server" probeResult="failure" output=< Mar 18 09:28:03 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:28:03 crc kubenswrapper[4778]: > Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.085589 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.181729 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht6bc\" (UniqueName: \"kubernetes.io/projected/ec16e337-91fc-40c5-b3d4-87b5243e5a73-kube-api-access-ht6bc\") pod \"ec16e337-91fc-40c5-b3d4-87b5243e5a73\" (UID: \"ec16e337-91fc-40c5-b3d4-87b5243e5a73\") " Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.188164 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec16e337-91fc-40c5-b3d4-87b5243e5a73-kube-api-access-ht6bc" (OuterVolumeSpecName: "kube-api-access-ht6bc") pod "ec16e337-91fc-40c5-b3d4-87b5243e5a73" (UID: "ec16e337-91fc-40c5-b3d4-87b5243e5a73"). InnerVolumeSpecName "kube-api-access-ht6bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.282349 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.283745 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht6bc\" (UniqueName: \"kubernetes.io/projected/ec16e337-91fc-40c5-b3d4-87b5243e5a73-kube-api-access-ht6bc\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.385494 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqc9n\" (UniqueName: \"kubernetes.io/projected/154a89df-1c2e-4f86-bbf3-827d6443c04a-kube-api-access-zqc9n\") pod \"154a89df-1c2e-4f86-bbf3-827d6443c04a\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.385778 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-repo-setup-combined-ca-bundle\") pod \"154a89df-1c2e-4f86-bbf3-827d6443c04a\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.385858 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-ssh-key-openstack-edpm-ipam\") pod \"154a89df-1c2e-4f86-bbf3-827d6443c04a\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.385973 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-inventory\") pod \"154a89df-1c2e-4f86-bbf3-827d6443c04a\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.389395 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/154a89df-1c2e-4f86-bbf3-827d6443c04a-kube-api-access-zqc9n" (OuterVolumeSpecName: "kube-api-access-zqc9n") pod "154a89df-1c2e-4f86-bbf3-827d6443c04a" (UID: "154a89df-1c2e-4f86-bbf3-827d6443c04a"). InnerVolumeSpecName "kube-api-access-zqc9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.390168 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "154a89df-1c2e-4f86-bbf3-827d6443c04a" (UID: "154a89df-1c2e-4f86-bbf3-827d6443c04a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.409540 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "154a89df-1c2e-4f86-bbf3-827d6443c04a" (UID: "154a89df-1c2e-4f86-bbf3-827d6443c04a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.415954 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-inventory" (OuterVolumeSpecName: "inventory") pod "154a89df-1c2e-4f86-bbf3-827d6443c04a" (UID: "154a89df-1c2e-4f86-bbf3-827d6443c04a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.488713 4778 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.488750 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.488764 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.488776 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqc9n\" (UniqueName: \"kubernetes.io/projected/154a89df-1c2e-4f86-bbf3-827d6443c04a-kube-api-access-zqc9n\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.743004 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563768-4z27c" event={"ID":"ec16e337-91fc-40c5-b3d4-87b5243e5a73","Type":"ContainerDied","Data":"11eb68ad70b16e075622c19b845c0f9b2eb54e4b6f5459e3950e9528763c3016"} Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.743046 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.743065 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11eb68ad70b16e075622c19b845c0f9b2eb54e4b6f5459e3950e9528763c3016" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.745377 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" event={"ID":"154a89df-1c2e-4f86-bbf3-827d6443c04a","Type":"ContainerDied","Data":"a6279dd667e196403d3c5c22ceabeb3e0818bd4665ef38b4085c12358feff855"} Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.745431 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6279dd667e196403d3c5c22ceabeb3e0818bd4665ef38b4085c12358feff855" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.745462 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.857190 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z"] Mar 18 09:28:04 crc kubenswrapper[4778]: E0318 09:28:04.857693 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154a89df-1c2e-4f86-bbf3-827d6443c04a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.857716 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="154a89df-1c2e-4f86-bbf3-827d6443c04a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 09:28:04 crc kubenswrapper[4778]: E0318 09:28:04.857742 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec16e337-91fc-40c5-b3d4-87b5243e5a73" containerName="oc" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.857750 4778 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ec16e337-91fc-40c5-b3d4-87b5243e5a73" containerName="oc" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.857950 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec16e337-91fc-40c5-b3d4-87b5243e5a73" containerName="oc" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.857981 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="154a89df-1c2e-4f86-bbf3-827d6443c04a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.858716 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.862502 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.862709 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.863008 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.863337 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.885489 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z"] Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.912485 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.912611 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.912645 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.912728 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbscl\" (UniqueName: \"kubernetes.io/projected/b989f767-d1ba-49fe-aebb-6aef120e0e22-kube-api-access-kbscl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.014768 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbscl\" (UniqueName: \"kubernetes.io/projected/b989f767-d1ba-49fe-aebb-6aef120e0e22-kube-api-access-kbscl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 
09:28:05.014898 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.014938 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.014957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.018871 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.018969 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.022951 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.038134 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbscl\" (UniqueName: \"kubernetes.io/projected/b989f767-d1ba-49fe-aebb-6aef120e0e22-kube-api-access-kbscl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.162816 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563762-nk868"] Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.170812 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563762-nk868"] Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.176996 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.730699 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z"] Mar 18 09:28:05 crc kubenswrapper[4778]: W0318 09:28:05.739066 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb989f767_d1ba_49fe_aebb_6aef120e0e22.slice/crio-b5bc788ebd6b1883d8a8fc2b7f9f391272c9ac7329e6e4d8ed1d79fca68aaffa WatchSource:0}: Error finding container b5bc788ebd6b1883d8a8fc2b7f9f391272c9ac7329e6e4d8ed1d79fca68aaffa: Status 404 returned error can't find the container with id b5bc788ebd6b1883d8a8fc2b7f9f391272c9ac7329e6e4d8ed1d79fca68aaffa Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.757553 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" event={"ID":"b989f767-d1ba-49fe-aebb-6aef120e0e22","Type":"ContainerStarted","Data":"b5bc788ebd6b1883d8a8fc2b7f9f391272c9ac7329e6e4d8ed1d79fca68aaffa"} Mar 18 09:28:06 crc kubenswrapper[4778]: I0318 09:28:06.197439 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bcf145e-ae6a-4674-9f79-b6486ec2fa9d" path="/var/lib/kubelet/pods/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d/volumes" Mar 18 09:28:06 crc kubenswrapper[4778]: I0318 09:28:06.770140 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" event={"ID":"b989f767-d1ba-49fe-aebb-6aef120e0e22","Type":"ContainerStarted","Data":"df748c22cbd9cfd719213cf439a446ed8f2c405ec832bdc5f38d5aacebbce9a9"} Mar 18 09:28:06 crc kubenswrapper[4778]: I0318 09:28:06.796372 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" podStartSLOduration=2.225744699 
podStartE2EDuration="2.796350902s" podCreationTimestamp="2026-03-18 09:28:04 +0000 UTC" firstStartedPulling="2026-03-18 09:28:05.743799052 +0000 UTC m=+1552.318543932" lastFinishedPulling="2026-03-18 09:28:06.314405295 +0000 UTC m=+1552.889150135" observedRunningTime="2026-03-18 09:28:06.794311347 +0000 UTC m=+1553.369056207" watchObservedRunningTime="2026-03-18 09:28:06.796350902 +0000 UTC m=+1553.371095742" Mar 18 09:28:08 crc kubenswrapper[4778]: I0318 09:28:08.483440 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:28:12 crc kubenswrapper[4778]: I0318 09:28:12.314860 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:28:12 crc kubenswrapper[4778]: I0318 09:28:12.388703 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:28:12 crc kubenswrapper[4778]: I0318 09:28:12.570725 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6rqfc"] Mar 18 09:28:13 crc kubenswrapper[4778]: I0318 09:28:13.842372 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6rqfc" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="registry-server" containerID="cri-o://10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e" gracePeriod=2 Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.328798 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.434757 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-catalog-content\") pod \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.434899 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c267m\" (UniqueName: \"kubernetes.io/projected/60a043ee-4047-4c6e-9c4d-eaa8272648f6-kube-api-access-c267m\") pod \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.434986 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-utilities\") pod \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.441390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-utilities" (OuterVolumeSpecName: "utilities") pod "60a043ee-4047-4c6e-9c4d-eaa8272648f6" (UID: "60a043ee-4047-4c6e-9c4d-eaa8272648f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.441700 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a043ee-4047-4c6e-9c4d-eaa8272648f6-kube-api-access-c267m" (OuterVolumeSpecName: "kube-api-access-c267m") pod "60a043ee-4047-4c6e-9c4d-eaa8272648f6" (UID: "60a043ee-4047-4c6e-9c4d-eaa8272648f6"). InnerVolumeSpecName "kube-api-access-c267m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.521978 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60a043ee-4047-4c6e-9c4d-eaa8272648f6" (UID: "60a043ee-4047-4c6e-9c4d-eaa8272648f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.536770 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c267m\" (UniqueName: \"kubernetes.io/projected/60a043ee-4047-4c6e-9c4d-eaa8272648f6-kube-api-access-c267m\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.536811 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.536823 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.857603 4778 generic.go:334] "Generic (PLEG): container finished" podID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerID="10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e" exitCode=0 Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.857725 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rqfc" event={"ID":"60a043ee-4047-4c6e-9c4d-eaa8272648f6","Type":"ContainerDied","Data":"10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e"} Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.857755 4778 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.857791 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rqfc" event={"ID":"60a043ee-4047-4c6e-9c4d-eaa8272648f6","Type":"ContainerDied","Data":"90e09b0c12b6393363e62abca4fa79678465054f2984b57ee714525b9a8b228d"} Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.857814 4778 scope.go:117] "RemoveContainer" containerID="10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.886651 4778 scope.go:117] "RemoveContainer" containerID="04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.906410 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6rqfc"] Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.918014 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6rqfc"] Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.941978 4778 scope.go:117] "RemoveContainer" containerID="368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.965891 4778 scope.go:117] "RemoveContainer" containerID="10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e" Mar 18 09:28:14 crc kubenswrapper[4778]: E0318 09:28:14.966284 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e\": container with ID starting with 10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e not found: ID does not exist" containerID="10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.966309 
4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e"} err="failed to get container status \"10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e\": rpc error: code = NotFound desc = could not find container \"10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e\": container with ID starting with 10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e not found: ID does not exist" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.966327 4778 scope.go:117] "RemoveContainer" containerID="04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6" Mar 18 09:28:14 crc kubenswrapper[4778]: E0318 09:28:14.966885 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6\": container with ID starting with 04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6 not found: ID does not exist" containerID="04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.967048 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6"} err="failed to get container status \"04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6\": rpc error: code = NotFound desc = could not find container \"04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6\": container with ID starting with 04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6 not found: ID does not exist" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.967113 4778 scope.go:117] "RemoveContainer" containerID="368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62" Mar 18 09:28:14 crc kubenswrapper[4778]: E0318 
09:28:14.967417 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62\": container with ID starting with 368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62 not found: ID does not exist" containerID="368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.967440 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62"} err="failed to get container status \"368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62\": rpc error: code = NotFound desc = could not find container \"368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62\": container with ID starting with 368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62 not found: ID does not exist" Mar 18 09:28:16 crc kubenswrapper[4778]: I0318 09:28:16.203086 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" path="/var/lib/kubelet/pods/60a043ee-4047-4c6e-9c4d-eaa8272648f6/volumes" Mar 18 09:28:28 crc kubenswrapper[4778]: I0318 09:28:28.549713 4778 scope.go:117] "RemoveContainer" containerID="d4dc8e8b710d4699b5bf32fb7126e48abd2e3887409f25b1985152911c6485a4" Mar 18 09:28:28 crc kubenswrapper[4778]: I0318 09:28:28.577413 4778 scope.go:117] "RemoveContainer" containerID="1fd591e2b660a21c69fd30f836286a21f17c533f4da08c01dd7acbea44c0d5f9" Mar 18 09:28:28 crc kubenswrapper[4778]: I0318 09:28:28.647241 4778 scope.go:117] "RemoveContainer" containerID="a91ae12327b49c9ef11425b6654b97316a81c22c90fc2a91ca50368382d18bdb" Mar 18 09:28:28 crc kubenswrapper[4778]: I0318 09:28:28.723821 4778 scope.go:117] "RemoveContainer" containerID="6d2d7a8a11e12ed4124ccde5834d93c0bc78bbc0a9c88b791847bb709dbbb116" Mar 18 09:28:30 crc 
kubenswrapper[4778]: I0318 09:28:30.147867 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:28:30 crc kubenswrapper[4778]: I0318 09:28:30.148414 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:29:00 crc kubenswrapper[4778]: I0318 09:29:00.147549 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:29:00 crc kubenswrapper[4778]: I0318 09:29:00.148254 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.046931 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tpgxl"] Mar 18 09:29:04 crc kubenswrapper[4778]: E0318 09:29:04.048673 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="extract-utilities" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.048701 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="extract-utilities" Mar 18 09:29:04 crc kubenswrapper[4778]: E0318 09:29:04.048741 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="extract-content" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.048753 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="extract-content" Mar 18 09:29:04 crc kubenswrapper[4778]: E0318 09:29:04.048775 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="registry-server" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.048786 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="registry-server" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.049062 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="registry-server" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.051086 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.077509 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpgxl"] Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.156680 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-catalog-content\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.156724 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md686\" (UniqueName: \"kubernetes.io/projected/418d5e86-72b6-4030-b9e6-9d9402174c5c-kube-api-access-md686\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.156758 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-utilities\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.258283 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-catalog-content\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.258322 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-md686\" (UniqueName: \"kubernetes.io/projected/418d5e86-72b6-4030-b9e6-9d9402174c5c-kube-api-access-md686\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.258350 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-utilities\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.258792 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-utilities\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.259091 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-catalog-content\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.278824 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md686\" (UniqueName: \"kubernetes.io/projected/418d5e86-72b6-4030-b9e6-9d9402174c5c-kube-api-access-md686\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.391013 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.867903 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpgxl"] Mar 18 09:29:05 crc kubenswrapper[4778]: I0318 09:29:05.448923 4778 generic.go:334] "Generic (PLEG): container finished" podID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerID="cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc" exitCode=0 Mar 18 09:29:05 crc kubenswrapper[4778]: I0318 09:29:05.449050 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpgxl" event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerDied","Data":"cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc"} Mar 18 09:29:05 crc kubenswrapper[4778]: I0318 09:29:05.449326 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpgxl" event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerStarted","Data":"46e2104e09d595fc9eab76f4e7ebdca7f02555a48577717019a50c200d0c3244"} Mar 18 09:29:06 crc kubenswrapper[4778]: I0318 09:29:06.460944 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpgxl" event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerStarted","Data":"14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380"} Mar 18 09:29:09 crc kubenswrapper[4778]: I0318 09:29:09.513523 4778 generic.go:334] "Generic (PLEG): container finished" podID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerID="14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380" exitCode=0 Mar 18 09:29:09 crc kubenswrapper[4778]: I0318 09:29:09.513592 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpgxl" 
event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerDied","Data":"14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380"} Mar 18 09:29:10 crc kubenswrapper[4778]: I0318 09:29:10.527574 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpgxl" event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerStarted","Data":"f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8"} Mar 18 09:29:10 crc kubenswrapper[4778]: I0318 09:29:10.555459 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tpgxl" podStartSLOduration=2.052914615 podStartE2EDuration="6.555432121s" podCreationTimestamp="2026-03-18 09:29:04 +0000 UTC" firstStartedPulling="2026-03-18 09:29:05.451231686 +0000 UTC m=+1612.025976536" lastFinishedPulling="2026-03-18 09:29:09.953749202 +0000 UTC m=+1616.528494042" observedRunningTime="2026-03-18 09:29:10.55426695 +0000 UTC m=+1617.129011790" watchObservedRunningTime="2026-03-18 09:29:10.555432121 +0000 UTC m=+1617.130176981" Mar 18 09:29:14 crc kubenswrapper[4778]: I0318 09:29:14.391756 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:14 crc kubenswrapper[4778]: I0318 09:29:14.393039 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:15 crc kubenswrapper[4778]: I0318 09:29:15.465098 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tpgxl" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="registry-server" probeResult="failure" output=< Mar 18 09:29:15 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:29:15 crc kubenswrapper[4778]: > Mar 18 09:29:25 crc kubenswrapper[4778]: I0318 09:29:25.439924 4778 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-tpgxl" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="registry-server" probeResult="failure" output=< Mar 18 09:29:25 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:29:25 crc kubenswrapper[4778]: > Mar 18 09:29:28 crc kubenswrapper[4778]: I0318 09:29:28.914174 4778 scope.go:117] "RemoveContainer" containerID="b11383263b8711730419b8679c1b55038dd677778bfced742aebd89591fb64cf" Mar 18 09:29:28 crc kubenswrapper[4778]: I0318 09:29:28.982158 4778 scope.go:117] "RemoveContainer" containerID="14bdd0eaf1dbaf2dd412641a551329bb6a350b27bbb5a0fca3b8cd2009883565" Mar 18 09:29:29 crc kubenswrapper[4778]: I0318 09:29:29.023836 4778 scope.go:117] "RemoveContainer" containerID="a2a49acb877ac3d5291587410f4bdad35a27b8e7dc386fa78d21020a20cbe78c" Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.147407 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.147854 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.147915 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.149026 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.149132 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" gracePeriod=600 Mar 18 09:29:30 crc kubenswrapper[4778]: E0318 09:29:30.277343 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.738359 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" exitCode=0 Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.738416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492"} Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.738469 4778 scope.go:117] "RemoveContainer" containerID="84af870879787d3b66c88494739078983637b86ff0a5bba3998c98c4487f8853" Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.742781 4778 
scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:29:30 crc kubenswrapper[4778]: E0318 09:29:30.744241 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:29:34 crc kubenswrapper[4778]: I0318 09:29:34.458956 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:34 crc kubenswrapper[4778]: I0318 09:29:34.532563 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:35 crc kubenswrapper[4778]: I0318 09:29:35.244889 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpgxl"] Mar 18 09:29:35 crc kubenswrapper[4778]: I0318 09:29:35.796068 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tpgxl" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="registry-server" containerID="cri-o://f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8" gracePeriod=2 Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.356075 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.492614 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-catalog-content\") pod \"418d5e86-72b6-4030-b9e6-9d9402174c5c\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.492749 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-utilities\") pod \"418d5e86-72b6-4030-b9e6-9d9402174c5c\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.492781 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md686\" (UniqueName: \"kubernetes.io/projected/418d5e86-72b6-4030-b9e6-9d9402174c5c-kube-api-access-md686\") pod \"418d5e86-72b6-4030-b9e6-9d9402174c5c\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.493490 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-utilities" (OuterVolumeSpecName: "utilities") pod "418d5e86-72b6-4030-b9e6-9d9402174c5c" (UID: "418d5e86-72b6-4030-b9e6-9d9402174c5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.502811 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/418d5e86-72b6-4030-b9e6-9d9402174c5c-kube-api-access-md686" (OuterVolumeSpecName: "kube-api-access-md686") pod "418d5e86-72b6-4030-b9e6-9d9402174c5c" (UID: "418d5e86-72b6-4030-b9e6-9d9402174c5c"). InnerVolumeSpecName "kube-api-access-md686". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.594673 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.594732 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md686\" (UniqueName: \"kubernetes.io/projected/418d5e86-72b6-4030-b9e6-9d9402174c5c-kube-api-access-md686\") on node \"crc\" DevicePath \"\"" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.632889 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "418d5e86-72b6-4030-b9e6-9d9402174c5c" (UID: "418d5e86-72b6-4030-b9e6-9d9402174c5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.696691 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.813069 4778 generic.go:334] "Generic (PLEG): container finished" podID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerID="f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8" exitCode=0 Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.813120 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpgxl" event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerDied","Data":"f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8"} Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.813150 4778 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-tpgxl" event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerDied","Data":"46e2104e09d595fc9eab76f4e7ebdca7f02555a48577717019a50c200d0c3244"} Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.813173 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.813182 4778 scope.go:117] "RemoveContainer" containerID="f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.864549 4778 scope.go:117] "RemoveContainer" containerID="14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.894573 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpgxl"] Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.912540 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tpgxl"] Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.943087 4778 scope.go:117] "RemoveContainer" containerID="cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.997494 4778 scope.go:117] "RemoveContainer" containerID="f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8" Mar 18 09:29:37 crc kubenswrapper[4778]: E0318 09:29:37.012357 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8\": container with ID starting with f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8 not found: ID does not exist" containerID="f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8" Mar 18 09:29:37 crc kubenswrapper[4778]: I0318 09:29:37.012415 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8"} err="failed to get container status \"f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8\": rpc error: code = NotFound desc = could not find container \"f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8\": container with ID starting with f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8 not found: ID does not exist" Mar 18 09:29:37 crc kubenswrapper[4778]: I0318 09:29:37.012440 4778 scope.go:117] "RemoveContainer" containerID="14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380" Mar 18 09:29:37 crc kubenswrapper[4778]: E0318 09:29:37.021353 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380\": container with ID starting with 14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380 not found: ID does not exist" containerID="14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380" Mar 18 09:29:37 crc kubenswrapper[4778]: I0318 09:29:37.021603 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380"} err="failed to get container status \"14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380\": rpc error: code = NotFound desc = could not find container \"14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380\": container with ID starting with 14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380 not found: ID does not exist" Mar 18 09:29:37 crc kubenswrapper[4778]: I0318 09:29:37.021689 4778 scope.go:117] "RemoveContainer" containerID="cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc" Mar 18 09:29:37 crc kubenswrapper[4778]: E0318 
09:29:37.023396 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc\": container with ID starting with cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc not found: ID does not exist" containerID="cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc" Mar 18 09:29:37 crc kubenswrapper[4778]: I0318 09:29:37.023489 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc"} err="failed to get container status \"cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc\": rpc error: code = NotFound desc = could not find container \"cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc\": container with ID starting with cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc not found: ID does not exist" Mar 18 09:29:38 crc kubenswrapper[4778]: I0318 09:29:38.197921 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" path="/var/lib/kubelet/pods/418d5e86-72b6-4030-b9e6-9d9402174c5c/volumes" Mar 18 09:29:45 crc kubenswrapper[4778]: I0318 09:29:45.186844 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:29:45 crc kubenswrapper[4778]: E0318 09:29:45.187786 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:29:59 crc kubenswrapper[4778]: I0318 09:29:59.188065 
4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:29:59 crc kubenswrapper[4778]: E0318 09:29:59.189295 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.163487 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq"] Mar 18 09:30:00 crc kubenswrapper[4778]: E0318 09:30:00.164177 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="extract-utilities" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.164243 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="extract-utilities" Mar 18 09:30:00 crc kubenswrapper[4778]: E0318 09:30:00.164287 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="extract-content" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.164302 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="extract-content" Mar 18 09:30:00 crc kubenswrapper[4778]: E0318 09:30:00.164339 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="registry-server" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.164354 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="registry-server" Mar 
18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.164731 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="registry-server" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.165842 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.172013 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.172133 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.176651 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq"] Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.214536 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563770-5f9th"] Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.215931 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.218896 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.219319 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.219499 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.237282 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563770-5f9th"] Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.254695 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b2bddb-d94d-426e-bc18-8b864785e323-secret-volume\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.254848 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlhxl\" (UniqueName: \"kubernetes.io/projected/b6b2bddb-d94d-426e-bc18-8b864785e323-kube-api-access-xlhxl\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.254947 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b2bddb-d94d-426e-bc18-8b864785e323-config-volume\") pod \"collect-profiles-29563770-zldnq\" (UID: 
\"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.254975 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdclz\" (UniqueName: \"kubernetes.io/projected/d5229065-e84e-4d42-870f-1ee468bff359-kube-api-access-sdclz\") pod \"auto-csr-approver-29563770-5f9th\" (UID: \"d5229065-e84e-4d42-870f-1ee468bff359\") " pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.356008 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlhxl\" (UniqueName: \"kubernetes.io/projected/b6b2bddb-d94d-426e-bc18-8b864785e323-kube-api-access-xlhxl\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.356105 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b2bddb-d94d-426e-bc18-8b864785e323-config-volume\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.356140 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdclz\" (UniqueName: \"kubernetes.io/projected/d5229065-e84e-4d42-870f-1ee468bff359-kube-api-access-sdclz\") pod \"auto-csr-approver-29563770-5f9th\" (UID: \"d5229065-e84e-4d42-870f-1ee468bff359\") " pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.356247 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/b6b2bddb-d94d-426e-bc18-8b864785e323-secret-volume\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.357123 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b2bddb-d94d-426e-bc18-8b864785e323-config-volume\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.366510 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b2bddb-d94d-426e-bc18-8b864785e323-secret-volume\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.374754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdclz\" (UniqueName: \"kubernetes.io/projected/d5229065-e84e-4d42-870f-1ee468bff359-kube-api-access-sdclz\") pod \"auto-csr-approver-29563770-5f9th\" (UID: \"d5229065-e84e-4d42-870f-1ee468bff359\") " pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.381395 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlhxl\" (UniqueName: \"kubernetes.io/projected/b6b2bddb-d94d-426e-bc18-8b864785e323-kube-api-access-xlhxl\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.492072 4778 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.534455 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:01 crc kubenswrapper[4778]: I0318 09:30:01.014072 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq"] Mar 18 09:30:01 crc kubenswrapper[4778]: I0318 09:30:01.083123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" event={"ID":"b6b2bddb-d94d-426e-bc18-8b864785e323","Type":"ContainerStarted","Data":"188edaa43cc00dcfb6862af68da42938b2b5d43a074ba7244be6903f72115021"} Mar 18 09:30:01 crc kubenswrapper[4778]: I0318 09:30:01.083762 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563770-5f9th"] Mar 18 09:30:01 crc kubenswrapper[4778]: W0318 09:30:01.095887 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5229065_e84e_4d42_870f_1ee468bff359.slice/crio-dcd92a70f4238a237b5e0cd6ca40b1127dff72982d1f0f31b6ea951fbacd43e8 WatchSource:0}: Error finding container dcd92a70f4238a237b5e0cd6ca40b1127dff72982d1f0f31b6ea951fbacd43e8: Status 404 returned error can't find the container with id dcd92a70f4238a237b5e0cd6ca40b1127dff72982d1f0f31b6ea951fbacd43e8 Mar 18 09:30:02 crc kubenswrapper[4778]: I0318 09:30:02.099382 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563770-5f9th" event={"ID":"d5229065-e84e-4d42-870f-1ee468bff359","Type":"ContainerStarted","Data":"dcd92a70f4238a237b5e0cd6ca40b1127dff72982d1f0f31b6ea951fbacd43e8"} Mar 18 09:30:02 crc kubenswrapper[4778]: I0318 09:30:02.102021 4778 generic.go:334] "Generic 
(PLEG): container finished" podID="b6b2bddb-d94d-426e-bc18-8b864785e323" containerID="04359ca445cb3566112d245be577eaabe4ab24e27a18fca03074e13b6e3b403f" exitCode=0 Mar 18 09:30:02 crc kubenswrapper[4778]: I0318 09:30:02.102065 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" event={"ID":"b6b2bddb-d94d-426e-bc18-8b864785e323","Type":"ContainerDied","Data":"04359ca445cb3566112d245be577eaabe4ab24e27a18fca03074e13b6e3b403f"} Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.489660 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.543777 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b2bddb-d94d-426e-bc18-8b864785e323-config-volume\") pod \"b6b2bddb-d94d-426e-bc18-8b864785e323\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.544168 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b2bddb-d94d-426e-bc18-8b864785e323-secret-volume\") pod \"b6b2bddb-d94d-426e-bc18-8b864785e323\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.544411 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlhxl\" (UniqueName: \"kubernetes.io/projected/b6b2bddb-d94d-426e-bc18-8b864785e323-kube-api-access-xlhxl\") pod \"b6b2bddb-d94d-426e-bc18-8b864785e323\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.546381 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b6b2bddb-d94d-426e-bc18-8b864785e323-config-volume" (OuterVolumeSpecName: "config-volume") pod "b6b2bddb-d94d-426e-bc18-8b864785e323" (UID: "b6b2bddb-d94d-426e-bc18-8b864785e323"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.550522 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b2bddb-d94d-426e-bc18-8b864785e323-kube-api-access-xlhxl" (OuterVolumeSpecName: "kube-api-access-xlhxl") pod "b6b2bddb-d94d-426e-bc18-8b864785e323" (UID: "b6b2bddb-d94d-426e-bc18-8b864785e323"). InnerVolumeSpecName "kube-api-access-xlhxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.575311 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b2bddb-d94d-426e-bc18-8b864785e323-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6b2bddb-d94d-426e-bc18-8b864785e323" (UID: "b6b2bddb-d94d-426e-bc18-8b864785e323"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.647154 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlhxl\" (UniqueName: \"kubernetes.io/projected/b6b2bddb-d94d-426e-bc18-8b864785e323-kube-api-access-xlhxl\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.647179 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b2bddb-d94d-426e-bc18-8b864785e323-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.647188 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b2bddb-d94d-426e-bc18-8b864785e323-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:04 crc kubenswrapper[4778]: I0318 09:30:04.121618 4778 generic.go:334] "Generic (PLEG): container finished" podID="d5229065-e84e-4d42-870f-1ee468bff359" containerID="8faf9c7a656879007008e10d6b7f5d22a002ddd8fac9065c9f561e0e336487fd" exitCode=0 Mar 18 09:30:04 crc kubenswrapper[4778]: I0318 09:30:04.121948 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563770-5f9th" event={"ID":"d5229065-e84e-4d42-870f-1ee468bff359","Type":"ContainerDied","Data":"8faf9c7a656879007008e10d6b7f5d22a002ddd8fac9065c9f561e0e336487fd"} Mar 18 09:30:04 crc kubenswrapper[4778]: I0318 09:30:04.124679 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" event={"ID":"b6b2bddb-d94d-426e-bc18-8b864785e323","Type":"ContainerDied","Data":"188edaa43cc00dcfb6862af68da42938b2b5d43a074ba7244be6903f72115021"} Mar 18 09:30:04 crc kubenswrapper[4778]: I0318 09:30:04.124724 4778 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="188edaa43cc00dcfb6862af68da42938b2b5d43a074ba7244be6903f72115021" Mar 18 09:30:04 crc kubenswrapper[4778]: I0318 09:30:04.124824 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:05 crc kubenswrapper[4778]: I0318 09:30:05.489022 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:05 crc kubenswrapper[4778]: I0318 09:30:05.581020 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdclz\" (UniqueName: \"kubernetes.io/projected/d5229065-e84e-4d42-870f-1ee468bff359-kube-api-access-sdclz\") pod \"d5229065-e84e-4d42-870f-1ee468bff359\" (UID: \"d5229065-e84e-4d42-870f-1ee468bff359\") " Mar 18 09:30:05 crc kubenswrapper[4778]: I0318 09:30:05.588002 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5229065-e84e-4d42-870f-1ee468bff359-kube-api-access-sdclz" (OuterVolumeSpecName: "kube-api-access-sdclz") pod "d5229065-e84e-4d42-870f-1ee468bff359" (UID: "d5229065-e84e-4d42-870f-1ee468bff359"). InnerVolumeSpecName "kube-api-access-sdclz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:30:05 crc kubenswrapper[4778]: I0318 09:30:05.683814 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdclz\" (UniqueName: \"kubernetes.io/projected/d5229065-e84e-4d42-870f-1ee468bff359-kube-api-access-sdclz\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:06 crc kubenswrapper[4778]: I0318 09:30:06.153138 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563770-5f9th" event={"ID":"d5229065-e84e-4d42-870f-1ee468bff359","Type":"ContainerDied","Data":"dcd92a70f4238a237b5e0cd6ca40b1127dff72982d1f0f31b6ea951fbacd43e8"} Mar 18 09:30:06 crc kubenswrapper[4778]: I0318 09:30:06.153185 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:06 crc kubenswrapper[4778]: I0318 09:30:06.153188 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcd92a70f4238a237b5e0cd6ca40b1127dff72982d1f0f31b6ea951fbacd43e8" Mar 18 09:30:06 crc kubenswrapper[4778]: I0318 09:30:06.577685 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563764-s4crc"] Mar 18 09:30:06 crc kubenswrapper[4778]: I0318 09:30:06.589749 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563764-s4crc"] Mar 18 09:30:08 crc kubenswrapper[4778]: I0318 09:30:08.200668 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb3b2d75-fc85-48dc-8533-18ecd8c75187" path="/var/lib/kubelet/pods/bb3b2d75-fc85-48dc-8533-18ecd8c75187/volumes" Mar 18 09:30:13 crc kubenswrapper[4778]: I0318 09:30:13.187690 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:30:13 crc kubenswrapper[4778]: E0318 09:30:13.188495 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:30:26 crc kubenswrapper[4778]: I0318 09:30:26.189186 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:30:26 crc kubenswrapper[4778]: E0318 09:30:26.190154 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:30:29 crc kubenswrapper[4778]: I0318 09:30:29.108169 4778 scope.go:117] "RemoveContainer" containerID="2ec42f2618fc279e3b11e295de9307609aec937968481eb47ae40bf89eeec176" Mar 18 09:30:39 crc kubenswrapper[4778]: I0318 09:30:39.187968 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:30:39 crc kubenswrapper[4778]: E0318 09:30:39.188935 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:30:52 crc kubenswrapper[4778]: I0318 09:30:52.188700 4778 scope.go:117] "RemoveContainer" 
containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:30:52 crc kubenswrapper[4778]: E0318 09:30:52.189393 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:31:04 crc kubenswrapper[4778]: I0318 09:31:04.770577 4778 generic.go:334] "Generic (PLEG): container finished" podID="b989f767-d1ba-49fe-aebb-6aef120e0e22" containerID="df748c22cbd9cfd719213cf439a446ed8f2c405ec832bdc5f38d5aacebbce9a9" exitCode=0 Mar 18 09:31:04 crc kubenswrapper[4778]: I0318 09:31:04.770659 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" event={"ID":"b989f767-d1ba-49fe-aebb-6aef120e0e22","Type":"ContainerDied","Data":"df748c22cbd9cfd719213cf439a446ed8f2c405ec832bdc5f38d5aacebbce9a9"} Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.223948 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.385339 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-bootstrap-combined-ca-bundle\") pod \"b989f767-d1ba-49fe-aebb-6aef120e0e22\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.385402 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbscl\" (UniqueName: \"kubernetes.io/projected/b989f767-d1ba-49fe-aebb-6aef120e0e22-kube-api-access-kbscl\") pod \"b989f767-d1ba-49fe-aebb-6aef120e0e22\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.385494 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-inventory\") pod \"b989f767-d1ba-49fe-aebb-6aef120e0e22\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.385527 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-ssh-key-openstack-edpm-ipam\") pod \"b989f767-d1ba-49fe-aebb-6aef120e0e22\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.393317 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b989f767-d1ba-49fe-aebb-6aef120e0e22" (UID: "b989f767-d1ba-49fe-aebb-6aef120e0e22"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.405679 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b989f767-d1ba-49fe-aebb-6aef120e0e22-kube-api-access-kbscl" (OuterVolumeSpecName: "kube-api-access-kbscl") pod "b989f767-d1ba-49fe-aebb-6aef120e0e22" (UID: "b989f767-d1ba-49fe-aebb-6aef120e0e22"). InnerVolumeSpecName "kube-api-access-kbscl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.414410 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-inventory" (OuterVolumeSpecName: "inventory") pod "b989f767-d1ba-49fe-aebb-6aef120e0e22" (UID: "b989f767-d1ba-49fe-aebb-6aef120e0e22"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.432904 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b989f767-d1ba-49fe-aebb-6aef120e0e22" (UID: "b989f767-d1ba-49fe-aebb-6aef120e0e22"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.486981 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.487014 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.487025 4778 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.487034 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbscl\" (UniqueName: \"kubernetes.io/projected/b989f767-d1ba-49fe-aebb-6aef120e0e22-kube-api-access-kbscl\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.801804 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" event={"ID":"b989f767-d1ba-49fe-aebb-6aef120e0e22","Type":"ContainerDied","Data":"b5bc788ebd6b1883d8a8fc2b7f9f391272c9ac7329e6e4d8ed1d79fca68aaffa"} Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.801862 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5bc788ebd6b1883d8a8fc2b7f9f391272c9ac7329e6e4d8ed1d79fca68aaffa" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.801926 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.917515 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf"] Mar 18 09:31:06 crc kubenswrapper[4778]: E0318 09:31:06.918067 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5229065-e84e-4d42-870f-1ee468bff359" containerName="oc" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.918098 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5229065-e84e-4d42-870f-1ee468bff359" containerName="oc" Mar 18 09:31:06 crc kubenswrapper[4778]: E0318 09:31:06.918136 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b989f767-d1ba-49fe-aebb-6aef120e0e22" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.918152 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b989f767-d1ba-49fe-aebb-6aef120e0e22" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 09:31:06 crc kubenswrapper[4778]: E0318 09:31:06.918187 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b2bddb-d94d-426e-bc18-8b864785e323" containerName="collect-profiles" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.918224 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b2bddb-d94d-426e-bc18-8b864785e323" containerName="collect-profiles" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.919388 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b989f767-d1ba-49fe-aebb-6aef120e0e22" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.919444 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b2bddb-d94d-426e-bc18-8b864785e323" containerName="collect-profiles" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 
09:31:06.919476 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5229065-e84e-4d42-870f-1ee468bff359" containerName="oc" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.920648 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.926792 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.926924 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.927727 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.928103 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.937527 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf"] Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.996839 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.996989 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlmz6\" (UniqueName: 
\"kubernetes.io/projected/2fe04bef-41cb-47c4-8031-141f8809e8cb-kube-api-access-wlmz6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.997163 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.099426 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.099537 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlmz6\" (UniqueName: \"kubernetes.io/projected/2fe04bef-41cb-47c4-8031-141f8809e8cb-kube-api-access-wlmz6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.099634 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-ssh-key-openstack-edpm-ipam\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.106073 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.113997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.129181 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlmz6\" (UniqueName: \"kubernetes.io/projected/2fe04bef-41cb-47c4-8031-141f8809e8cb-kube-api-access-wlmz6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.186945 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:31:07 crc kubenswrapper[4778]: E0318 09:31:07.187213 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.240930 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.946038 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf"] Mar 18 09:31:08 crc kubenswrapper[4778]: I0318 09:31:08.827930 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" event={"ID":"2fe04bef-41cb-47c4-8031-141f8809e8cb","Type":"ContainerStarted","Data":"448fd951e8632e6cb54458ae0feebd671c32acd586b867a21adf3d37b278a94c"} Mar 18 09:31:09 crc kubenswrapper[4778]: I0318 09:31:09.841476 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" event={"ID":"2fe04bef-41cb-47c4-8031-141f8809e8cb","Type":"ContainerStarted","Data":"fd288c3256024cadd2bb212c37d37772aadd4cda1a6ce7e57e524f08cb5c87a0"} Mar 18 09:31:09 crc kubenswrapper[4778]: I0318 09:31:09.872374 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" podStartSLOduration=2.351412167 podStartE2EDuration="3.872353567s" podCreationTimestamp="2026-03-18 09:31:06 +0000 UTC" firstStartedPulling="2026-03-18 09:31:07.951825746 +0000 UTC m=+1734.526570606" lastFinishedPulling="2026-03-18 09:31:09.472767126 +0000 UTC m=+1736.047512006" observedRunningTime="2026-03-18 09:31:09.860782844 +0000 UTC m=+1736.435527684" watchObservedRunningTime="2026-03-18 09:31:09.872353567 +0000 UTC m=+1736.447098407" Mar 18 09:31:20 crc 
kubenswrapper[4778]: I0318 09:31:20.188343 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:31:20 crc kubenswrapper[4778]: E0318 09:31:20.189612 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:31:31 crc kubenswrapper[4778]: I0318 09:31:31.187407 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:31:31 crc kubenswrapper[4778]: E0318 09:31:31.188265 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:31:44 crc kubenswrapper[4778]: I0318 09:31:44.200854 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:31:44 crc kubenswrapper[4778]: E0318 09:31:44.203762 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 
18 09:31:56 crc kubenswrapper[4778]: I0318 09:31:56.189679 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:31:56 crc kubenswrapper[4778]: E0318 09:31:56.190708 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.163715 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563772-tcj6t"] Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.166812 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.169104 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.169316 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.169519 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.183092 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563772-tcj6t"] Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.314729 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrt4f\" (UniqueName: 
\"kubernetes.io/projected/15232b66-3433-4405-9feb-79055e892b3d-kube-api-access-zrt4f\") pod \"auto-csr-approver-29563772-tcj6t\" (UID: \"15232b66-3433-4405-9feb-79055e892b3d\") " pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.416469 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrt4f\" (UniqueName: \"kubernetes.io/projected/15232b66-3433-4405-9feb-79055e892b3d-kube-api-access-zrt4f\") pod \"auto-csr-approver-29563772-tcj6t\" (UID: \"15232b66-3433-4405-9feb-79055e892b3d\") " pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.438592 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrt4f\" (UniqueName: \"kubernetes.io/projected/15232b66-3433-4405-9feb-79055e892b3d-kube-api-access-zrt4f\") pod \"auto-csr-approver-29563772-tcj6t\" (UID: \"15232b66-3433-4405-9feb-79055e892b3d\") " pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.491260 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:01 crc kubenswrapper[4778]: I0318 09:32:01.055301 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563772-tcj6t"] Mar 18 09:32:01 crc kubenswrapper[4778]: I0318 09:32:01.413688 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" event={"ID":"15232b66-3433-4405-9feb-79055e892b3d","Type":"ContainerStarted","Data":"db674ce39ebaf74f85133b048779d3fd48c7602d67b049f235224eed5592d6a3"} Mar 18 09:32:03 crc kubenswrapper[4778]: I0318 09:32:03.439063 4778 generic.go:334] "Generic (PLEG): container finished" podID="15232b66-3433-4405-9feb-79055e892b3d" containerID="c8ccd760df68dbd5ce4bef875e9b41962b50e1c9d6413d0f1f66a324748d7c49" exitCode=0 Mar 18 09:32:03 crc kubenswrapper[4778]: I0318 09:32:03.439180 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" event={"ID":"15232b66-3433-4405-9feb-79055e892b3d","Type":"ContainerDied","Data":"c8ccd760df68dbd5ce4bef875e9b41962b50e1c9d6413d0f1f66a324748d7c49"} Mar 18 09:32:04 crc kubenswrapper[4778]: I0318 09:32:04.771444 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:04 crc kubenswrapper[4778]: I0318 09:32:04.906384 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrt4f\" (UniqueName: \"kubernetes.io/projected/15232b66-3433-4405-9feb-79055e892b3d-kube-api-access-zrt4f\") pod \"15232b66-3433-4405-9feb-79055e892b3d\" (UID: \"15232b66-3433-4405-9feb-79055e892b3d\") " Mar 18 09:32:04 crc kubenswrapper[4778]: I0318 09:32:04.916575 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15232b66-3433-4405-9feb-79055e892b3d-kube-api-access-zrt4f" (OuterVolumeSpecName: "kube-api-access-zrt4f") pod "15232b66-3433-4405-9feb-79055e892b3d" (UID: "15232b66-3433-4405-9feb-79055e892b3d"). InnerVolumeSpecName "kube-api-access-zrt4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:32:05 crc kubenswrapper[4778]: I0318 09:32:05.009037 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrt4f\" (UniqueName: \"kubernetes.io/projected/15232b66-3433-4405-9feb-79055e892b3d-kube-api-access-zrt4f\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:05 crc kubenswrapper[4778]: I0318 09:32:05.465472 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" event={"ID":"15232b66-3433-4405-9feb-79055e892b3d","Type":"ContainerDied","Data":"db674ce39ebaf74f85133b048779d3fd48c7602d67b049f235224eed5592d6a3"} Mar 18 09:32:05 crc kubenswrapper[4778]: I0318 09:32:05.465528 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db674ce39ebaf74f85133b048779d3fd48c7602d67b049f235224eed5592d6a3" Mar 18 09:32:05 crc kubenswrapper[4778]: I0318 09:32:05.465579 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:05 crc kubenswrapper[4778]: I0318 09:32:05.859670 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563766-lqhxm"] Mar 18 09:32:05 crc kubenswrapper[4778]: I0318 09:32:05.868755 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563766-lqhxm"] Mar 18 09:32:06 crc kubenswrapper[4778]: I0318 09:32:06.198978 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa" path="/var/lib/kubelet/pods/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa/volumes" Mar 18 09:32:07 crc kubenswrapper[4778]: I0318 09:32:07.031470 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5387-account-create-update-wm5k5"] Mar 18 09:32:07 crc kubenswrapper[4778]: I0318 09:32:07.041717 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-dz4jc"] Mar 18 09:32:07 crc kubenswrapper[4778]: I0318 09:32:07.049731 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5387-account-create-update-wm5k5"] Mar 18 09:32:07 crc kubenswrapper[4778]: I0318 09:32:07.058526 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-dz4jc"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.030987 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-l6vl8"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.040055 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a222-account-create-update-qr82t"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.051713 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-a222-account-create-update-qr82t"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.061591 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-create-l6vl8"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.071861 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0a46-account-create-update-phb5p"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.079437 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0a46-account-create-update-phb5p"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.280056 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24412394-390b-461c-9d18-617eba706adc" path="/var/lib/kubelet/pods/24412394-390b-461c-9d18-617eba706adc/volumes" Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.280735 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" path="/var/lib/kubelet/pods/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9/volumes" Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.281281 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a820a6-6a95-4ab7-a9d8-6649fe45464a" path="/var/lib/kubelet/pods/51a820a6-6a95-4ab7-a9d8-6649fe45464a/volumes" Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.281800 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ae14ca-efde-42ba-8edf-7cc34dc31036" path="/var/lib/kubelet/pods/c7ae14ca-efde-42ba-8edf-7cc34dc31036/volumes" Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.282839 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9222d9a-6507-4c32-9234-2c1c2b27a11e" path="/var/lib/kubelet/pods/f9222d9a-6507-4c32-9234-2c1c2b27a11e/volumes" Mar 18 09:32:09 crc kubenswrapper[4778]: I0318 09:32:09.046924 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kj8ww"] Mar 18 09:32:09 crc kubenswrapper[4778]: I0318 09:32:09.062744 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kj8ww"] Mar 18 09:32:09 crc 
kubenswrapper[4778]: I0318 09:32:09.188959 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:32:09 crc kubenswrapper[4778]: E0318 09:32:09.191475 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:32:10 crc kubenswrapper[4778]: I0318 09:32:10.202146 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be311af4-91f5-417e-971b-c9158576ca97" path="/var/lib/kubelet/pods/be311af4-91f5-417e-971b-c9158576ca97/volumes" Mar 18 09:32:17 crc kubenswrapper[4778]: I0318 09:32:17.582710 4778 generic.go:334] "Generic (PLEG): container finished" podID="2fe04bef-41cb-47c4-8031-141f8809e8cb" containerID="fd288c3256024cadd2bb212c37d37772aadd4cda1a6ce7e57e524f08cb5c87a0" exitCode=0 Mar 18 09:32:17 crc kubenswrapper[4778]: I0318 09:32:17.582811 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" event={"ID":"2fe04bef-41cb-47c4-8031-141f8809e8cb","Type":"ContainerDied","Data":"fd288c3256024cadd2bb212c37d37772aadd4cda1a6ce7e57e524f08cb5c87a0"} Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.062306 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.215697 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-inventory\") pod \"2fe04bef-41cb-47c4-8031-141f8809e8cb\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.215972 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-ssh-key-openstack-edpm-ipam\") pod \"2fe04bef-41cb-47c4-8031-141f8809e8cb\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.216045 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlmz6\" (UniqueName: \"kubernetes.io/projected/2fe04bef-41cb-47c4-8031-141f8809e8cb-kube-api-access-wlmz6\") pod \"2fe04bef-41cb-47c4-8031-141f8809e8cb\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.240875 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe04bef-41cb-47c4-8031-141f8809e8cb-kube-api-access-wlmz6" (OuterVolumeSpecName: "kube-api-access-wlmz6") pod "2fe04bef-41cb-47c4-8031-141f8809e8cb" (UID: "2fe04bef-41cb-47c4-8031-141f8809e8cb"). InnerVolumeSpecName "kube-api-access-wlmz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.247404 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-inventory" (OuterVolumeSpecName: "inventory") pod "2fe04bef-41cb-47c4-8031-141f8809e8cb" (UID: "2fe04bef-41cb-47c4-8031-141f8809e8cb"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.267345 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2fe04bef-41cb-47c4-8031-141f8809e8cb" (UID: "2fe04bef-41cb-47c4-8031-141f8809e8cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.318950 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.319010 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlmz6\" (UniqueName: \"kubernetes.io/projected/2fe04bef-41cb-47c4-8031-141f8809e8cb-kube-api-access-wlmz6\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.319032 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.608421 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" event={"ID":"2fe04bef-41cb-47c4-8031-141f8809e8cb","Type":"ContainerDied","Data":"448fd951e8632e6cb54458ae0feebd671c32acd586b867a21adf3d37b278a94c"} Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.608483 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.608509 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="448fd951e8632e6cb54458ae0feebd671c32acd586b867a21adf3d37b278a94c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.692029 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c"] Mar 18 09:32:19 crc kubenswrapper[4778]: E0318 09:32:19.692514 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15232b66-3433-4405-9feb-79055e892b3d" containerName="oc" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.692543 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="15232b66-3433-4405-9feb-79055e892b3d" containerName="oc" Mar 18 09:32:19 crc kubenswrapper[4778]: E0318 09:32:19.692582 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe04bef-41cb-47c4-8031-141f8809e8cb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.692597 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe04bef-41cb-47c4-8031-141f8809e8cb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.692892 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe04bef-41cb-47c4-8031-141f8809e8cb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.692940 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="15232b66-3433-4405-9feb-79055e892b3d" containerName="oc" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.693651 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.695523 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.696021 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.696841 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.700253 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.713477 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c"] Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.837545 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.837595 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdgb8\" (UniqueName: \"kubernetes.io/projected/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-kube-api-access-vdgb8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 
09:32:19.837632 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.939887 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.939983 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdgb8\" (UniqueName: \"kubernetes.io/projected/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-kube-api-access-vdgb8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.940057 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.944054 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.944218 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.956850 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdgb8\" (UniqueName: \"kubernetes.io/projected/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-kube-api-access-vdgb8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:20 crc kubenswrapper[4778]: I0318 09:32:20.009250 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:20 crc kubenswrapper[4778]: I0318 09:32:20.557189 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c"] Mar 18 09:32:20 crc kubenswrapper[4778]: I0318 09:32:20.559534 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:32:20 crc kubenswrapper[4778]: I0318 09:32:20.618676 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" event={"ID":"8e3f07f1-8381-48a9-8ebb-9cd3a821783f","Type":"ContainerStarted","Data":"38f16aa0f38a4d4d585e71d4489ff45542b89cbc52957dcc955eb4dbf7f944b1"} Mar 18 09:32:21 crc kubenswrapper[4778]: I0318 09:32:21.626671 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" event={"ID":"8e3f07f1-8381-48a9-8ebb-9cd3a821783f","Type":"ContainerStarted","Data":"624102bc1fa0c3850d7e6900b631b277f6bac2b359e2ccf5e40af6d9d87d6742"} Mar 18 09:32:21 crc kubenswrapper[4778]: I0318 09:32:21.657637 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" podStartSLOduration=2.213186226 podStartE2EDuration="2.657612309s" podCreationTimestamp="2026-03-18 09:32:19 +0000 UTC" firstStartedPulling="2026-03-18 09:32:20.559130028 +0000 UTC m=+1807.133874878" lastFinishedPulling="2026-03-18 09:32:21.003556091 +0000 UTC m=+1807.578300961" observedRunningTime="2026-03-18 09:32:21.638654727 +0000 UTC m=+1808.213399617" watchObservedRunningTime="2026-03-18 09:32:21.657612309 +0000 UTC m=+1808.232357159" Mar 18 09:32:23 crc kubenswrapper[4778]: I0318 09:32:23.187322 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:32:23 
crc kubenswrapper[4778]: E0318 09:32:23.187680 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:32:25 crc kubenswrapper[4778]: I0318 09:32:25.676692 4778 generic.go:334] "Generic (PLEG): container finished" podID="8e3f07f1-8381-48a9-8ebb-9cd3a821783f" containerID="624102bc1fa0c3850d7e6900b631b277f6bac2b359e2ccf5e40af6d9d87d6742" exitCode=0 Mar 18 09:32:25 crc kubenswrapper[4778]: I0318 09:32:25.676880 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" event={"ID":"8e3f07f1-8381-48a9-8ebb-9cd3a821783f","Type":"ContainerDied","Data":"624102bc1fa0c3850d7e6900b631b277f6bac2b359e2ccf5e40af6d9d87d6742"} Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.257602 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.381915 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-ssh-key-openstack-edpm-ipam\") pod \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.382001 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-inventory\") pod \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.382034 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdgb8\" (UniqueName: \"kubernetes.io/projected/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-kube-api-access-vdgb8\") pod \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.389223 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-kube-api-access-vdgb8" (OuterVolumeSpecName: "kube-api-access-vdgb8") pod "8e3f07f1-8381-48a9-8ebb-9cd3a821783f" (UID: "8e3f07f1-8381-48a9-8ebb-9cd3a821783f"). InnerVolumeSpecName "kube-api-access-vdgb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.418029 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8e3f07f1-8381-48a9-8ebb-9cd3a821783f" (UID: "8e3f07f1-8381-48a9-8ebb-9cd3a821783f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.420614 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-inventory" (OuterVolumeSpecName: "inventory") pod "8e3f07f1-8381-48a9-8ebb-9cd3a821783f" (UID: "8e3f07f1-8381-48a9-8ebb-9cd3a821783f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.484884 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.484921 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.484933 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdgb8\" (UniqueName: \"kubernetes.io/projected/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-kube-api-access-vdgb8\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.698778 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" 
event={"ID":"8e3f07f1-8381-48a9-8ebb-9cd3a821783f","Type":"ContainerDied","Data":"38f16aa0f38a4d4d585e71d4489ff45542b89cbc52957dcc955eb4dbf7f944b1"} Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.699122 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38f16aa0f38a4d4d585e71d4489ff45542b89cbc52957dcc955eb4dbf7f944b1" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.698938 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.841893 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk"] Mar 18 09:32:27 crc kubenswrapper[4778]: E0318 09:32:27.842346 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e3f07f1-8381-48a9-8ebb-9cd3a821783f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.842369 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e3f07f1-8381-48a9-8ebb-9cd3a821783f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.842586 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e3f07f1-8381-48a9-8ebb-9cd3a821783f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.843324 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.847364 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.847611 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.847972 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.849471 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.862078 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk"] Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.893275 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.893510 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.893588 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsjp6\" (UniqueName: \"kubernetes.io/projected/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-kube-api-access-vsjp6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.994795 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.995075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.995185 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsjp6\" (UniqueName: \"kubernetes.io/projected/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-kube-api-access-vsjp6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.998764 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:28 crc kubenswrapper[4778]: I0318 09:32:28.000556 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:28 crc kubenswrapper[4778]: I0318 09:32:28.023523 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsjp6\" (UniqueName: \"kubernetes.io/projected/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-kube-api-access-vsjp6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:28 crc kubenswrapper[4778]: I0318 09:32:28.164053 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:28 crc kubenswrapper[4778]: I0318 09:32:28.758701 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk"] Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.252361 4778 scope.go:117] "RemoveContainer" containerID="70fd7ebf08e80da75227c830c21c112cbd85c345b44bad8cd8c81cd3f4b7fd7e" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.275895 4778 scope.go:117] "RemoveContainer" containerID="c677c641e634b2b60d1bae546bbf8e8d9cc5553eb522dd4d67a26e72bf3f0752" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.329665 4778 scope.go:117] "RemoveContainer" containerID="2d1134d737bb1ad4d1096a8735192a53e75e70709be8071f894f8def68f8db65" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.449695 4778 scope.go:117] "RemoveContainer" containerID="9b6fa295a9bfec83f890b1dd7210afd8d93f1d1f4da240bfbc29bf8af750edd0" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.512569 4778 scope.go:117] "RemoveContainer" containerID="397a643cfe9ac0a1d5786dfe10182c0fd656474f01a6f126523a52404ea544a7" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.547469 4778 scope.go:117] "RemoveContainer" containerID="c24773f49ad71f17b93d5ac7609065bb82dac185ae78530f1dcf0ecca87ade20" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.584412 4778 scope.go:117] "RemoveContainer" containerID="39a152a6bb8ed07675c14ece0ad21851da7a9e9a103afe2312ea0254cd99b29c" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.727172 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" event={"ID":"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803","Type":"ContainerStarted","Data":"9d9ac3d0a2c7513d5a45e8e6d19a1411862166adf069327b3e174ecc2c3c3a28"} Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.727685 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" event={"ID":"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803","Type":"ContainerStarted","Data":"80e8fcc25351430440263b6d5eaa255eb52e520388064f88dde7e70815ffe72a"} Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.755092 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" podStartSLOduration=2.178081733 podStartE2EDuration="2.755070199s" podCreationTimestamp="2026-03-18 09:32:27 +0000 UTC" firstStartedPulling="2026-03-18 09:32:28.766137988 +0000 UTC m=+1815.340882818" lastFinishedPulling="2026-03-18 09:32:29.343126434 +0000 UTC m=+1815.917871284" observedRunningTime="2026-03-18 09:32:29.739888508 +0000 UTC m=+1816.314633358" watchObservedRunningTime="2026-03-18 09:32:29.755070199 +0000 UTC m=+1816.329815049" Mar 18 09:32:32 crc kubenswrapper[4778]: I0318 09:32:32.056335 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zrjls"] Mar 18 09:32:32 crc kubenswrapper[4778]: I0318 09:32:32.093737 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zrjls"] Mar 18 09:32:32 crc kubenswrapper[4778]: I0318 09:32:32.199339 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8560ebac-334f-4332-b324-cdb297a94b1a" path="/var/lib/kubelet/pods/8560ebac-334f-4332-b324-cdb297a94b1a/volumes" Mar 18 09:32:37 crc kubenswrapper[4778]: I0318 09:32:37.038055 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-b66ph"] Mar 18 09:32:37 crc kubenswrapper[4778]: I0318 09:32:37.050132 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-b66ph"] Mar 18 09:32:37 crc kubenswrapper[4778]: I0318 09:32:37.188093 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:32:37 crc kubenswrapper[4778]: E0318 
09:32:37.188516 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:32:38 crc kubenswrapper[4778]: I0318 09:32:38.201370 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dadb643-21f7-497a-992f-41ab80c704c5" path="/var/lib/kubelet/pods/5dadb643-21f7-497a-992f-41ab80c704c5/volumes" Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.060547 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-2cxtn"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.072036 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-sz5dt"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.082783 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b89b-account-create-update-ff8z8"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.091155 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-q979b"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.098001 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4e8f-account-create-update-ztvnt"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.104605 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6980-account-create-update-8lctt"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.111408 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-2cxtn"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.117932 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-create-sz5dt"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.125507 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4e8f-account-create-update-ztvnt"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.132178 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b89b-account-create-update-ff8z8"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.139424 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6980-account-create-update-8lctt"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.147254 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-q979b"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.198004 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320c5adc-a7d8-47a3-893b-7614c755446d" path="/var/lib/kubelet/pods/320c5adc-a7d8-47a3-893b-7614c755446d/volumes" Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.198756 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" path="/var/lib/kubelet/pods/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e/volumes" Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.199410 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" path="/var/lib/kubelet/pods/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7/volumes" Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.199959 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf66d17-48b6-4629-ae0c-e270afa0c88a" path="/var/lib/kubelet/pods/7cf66d17-48b6-4629-ae0c-e270afa0c88a/volumes" Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.201032 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9719662a-4248-4c3c-860b-1a9e6547876b" path="/var/lib/kubelet/pods/9719662a-4248-4c3c-860b-1a9e6547876b/volumes" Mar 18 09:32:48 crc 
kubenswrapper[4778]: I0318 09:32:48.201626 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dca6e4b2-4722-4a45-b577-33f3c5090fc3" path="/var/lib/kubelet/pods/dca6e4b2-4722-4a45-b577-33f3c5090fc3/volumes" Mar 18 09:32:50 crc kubenswrapper[4778]: I0318 09:32:50.187505 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:32:50 crc kubenswrapper[4778]: E0318 09:32:50.188063 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:32:53 crc kubenswrapper[4778]: I0318 09:32:53.039661 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-29tr5"] Mar 18 09:32:53 crc kubenswrapper[4778]: I0318 09:32:53.048902 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-29tr5"] Mar 18 09:32:54 crc kubenswrapper[4778]: I0318 09:32:54.198845 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6efa68-d15c-4d69-bd52-853a7cef8299" path="/var/lib/kubelet/pods/bb6efa68-d15c-4d69-bd52-853a7cef8299/volumes" Mar 18 09:33:03 crc kubenswrapper[4778]: I0318 09:33:03.187694 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:33:03 crc kubenswrapper[4778]: E0318 09:33:03.188513 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:33:08 crc kubenswrapper[4778]: I0318 09:33:08.156900 4778 generic.go:334] "Generic (PLEG): container finished" podID="9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" containerID="9d9ac3d0a2c7513d5a45e8e6d19a1411862166adf069327b3e174ecc2c3c3a28" exitCode=0 Mar 18 09:33:08 crc kubenswrapper[4778]: I0318 09:33:08.156995 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" event={"ID":"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803","Type":"ContainerDied","Data":"9d9ac3d0a2c7513d5a45e8e6d19a1411862166adf069327b3e174ecc2c3c3a28"} Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.576091 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.651409 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-ssh-key-openstack-edpm-ipam\") pod \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.676709 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" (UID: "9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.752937 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-inventory\") pod \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.753120 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsjp6\" (UniqueName: \"kubernetes.io/projected/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-kube-api-access-vsjp6\") pod \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.753523 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.758060 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-kube-api-access-vsjp6" (OuterVolumeSpecName: "kube-api-access-vsjp6") pod "9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" (UID: "9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803"). InnerVolumeSpecName "kube-api-access-vsjp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.778491 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-inventory" (OuterVolumeSpecName: "inventory") pod "9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" (UID: "9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.855585 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsjp6\" (UniqueName: \"kubernetes.io/projected/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-kube-api-access-vsjp6\") on node \"crc\" DevicePath \"\"" Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.855618 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.184160 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" event={"ID":"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803","Type":"ContainerDied","Data":"80e8fcc25351430440263b6d5eaa255eb52e520388064f88dde7e70815ffe72a"} Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.184272 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80e8fcc25351430440263b6d5eaa255eb52e520388064f88dde7e70815ffe72a" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.184311 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.280327 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf"] Mar 18 09:33:10 crc kubenswrapper[4778]: E0318 09:33:10.280835 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.280850 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.281030 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.281564 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.284439 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.284970 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.285458 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.288986 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.300910 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf"] Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.365591 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.365670 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.365871 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwvbs\" (UniqueName: \"kubernetes.io/projected/b33de03b-23ec-40c0-b309-0dd2024caf71-kube-api-access-bwvbs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.467371 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.467598 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwvbs\" (UniqueName: \"kubernetes.io/projected/b33de03b-23ec-40c0-b309-0dd2024caf71-kube-api-access-bwvbs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.467657 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.473410 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.474143 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.488636 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwvbs\" (UniqueName: \"kubernetes.io/projected/b33de03b-23ec-40c0-b309-0dd2024caf71-kube-api-access-bwvbs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.648778 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:11 crc kubenswrapper[4778]: I0318 09:33:11.002337 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf"] Mar 18 09:33:11 crc kubenswrapper[4778]: I0318 09:33:11.194191 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" event={"ID":"b33de03b-23ec-40c0-b309-0dd2024caf71","Type":"ContainerStarted","Data":"440d34e0cd124a64005a33484aec00463b1a2e180f1850432f84d55112a37b77"} Mar 18 09:33:12 crc kubenswrapper[4778]: I0318 09:33:12.205976 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" event={"ID":"b33de03b-23ec-40c0-b309-0dd2024caf71","Type":"ContainerStarted","Data":"fff08d52bb8eb70989a438cee10bfa00abf8f2f0d3255a5a38ba7fa52ac6fd7d"} Mar 18 09:33:12 crc kubenswrapper[4778]: I0318 09:33:12.232320 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" podStartSLOduration=1.7412338269999998 podStartE2EDuration="2.232300401s" podCreationTimestamp="2026-03-18 09:33:10 +0000 UTC" firstStartedPulling="2026-03-18 09:33:11.007530707 +0000 UTC m=+1857.582275547" lastFinishedPulling="2026-03-18 09:33:11.498597231 +0000 UTC m=+1858.073342121" observedRunningTime="2026-03-18 09:33:12.231578962 +0000 UTC m=+1858.806323812" watchObservedRunningTime="2026-03-18 09:33:12.232300401 +0000 UTC m=+1858.807045251" Mar 18 09:33:16 crc kubenswrapper[4778]: I0318 09:33:16.187754 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:33:16 crc kubenswrapper[4778]: E0318 09:33:16.188646 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:33:16 crc kubenswrapper[4778]: I0318 09:33:16.260500 4778 generic.go:334] "Generic (PLEG): container finished" podID="b33de03b-23ec-40c0-b309-0dd2024caf71" containerID="fff08d52bb8eb70989a438cee10bfa00abf8f2f0d3255a5a38ba7fa52ac6fd7d" exitCode=0 Mar 18 09:33:16 crc kubenswrapper[4778]: I0318 09:33:16.260560 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" event={"ID":"b33de03b-23ec-40c0-b309-0dd2024caf71","Type":"ContainerDied","Data":"fff08d52bb8eb70989a438cee10bfa00abf8f2f0d3255a5a38ba7fa52ac6fd7d"} Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.781541 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.947437 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-inventory\") pod \"b33de03b-23ec-40c0-b309-0dd2024caf71\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.947611 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwvbs\" (UniqueName: \"kubernetes.io/projected/b33de03b-23ec-40c0-b309-0dd2024caf71-kube-api-access-bwvbs\") pod \"b33de03b-23ec-40c0-b309-0dd2024caf71\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.947674 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-ssh-key-openstack-edpm-ipam\") pod \"b33de03b-23ec-40c0-b309-0dd2024caf71\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.954617 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b33de03b-23ec-40c0-b309-0dd2024caf71-kube-api-access-bwvbs" (OuterVolumeSpecName: "kube-api-access-bwvbs") pod "b33de03b-23ec-40c0-b309-0dd2024caf71" (UID: "b33de03b-23ec-40c0-b309-0dd2024caf71"). InnerVolumeSpecName "kube-api-access-bwvbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.988506 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-inventory" (OuterVolumeSpecName: "inventory") pod "b33de03b-23ec-40c0-b309-0dd2024caf71" (UID: "b33de03b-23ec-40c0-b309-0dd2024caf71"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.995918 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b33de03b-23ec-40c0-b309-0dd2024caf71" (UID: "b33de03b-23ec-40c0-b309-0dd2024caf71"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.050238 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.050312 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwvbs\" (UniqueName: \"kubernetes.io/projected/b33de03b-23ec-40c0-b309-0dd2024caf71-kube-api-access-bwvbs\") on node \"crc\" DevicePath \"\"" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.050337 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.281289 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" event={"ID":"b33de03b-23ec-40c0-b309-0dd2024caf71","Type":"ContainerDied","Data":"440d34e0cd124a64005a33484aec00463b1a2e180f1850432f84d55112a37b77"} Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.281334 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="440d34e0cd124a64005a33484aec00463b1a2e180f1850432f84d55112a37b77" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 
09:33:18.281384 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.381063 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"] Mar 18 09:33:18 crc kubenswrapper[4778]: E0318 09:33:18.381616 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b33de03b-23ec-40c0-b309-0dd2024caf71" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.381634 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b33de03b-23ec-40c0-b309-0dd2024caf71" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.381831 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b33de03b-23ec-40c0-b309-0dd2024caf71" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.382510 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.385347 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.385495 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.385577 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.387302 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.398037 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"] Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.467809 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.467865 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.467910 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z7p2\" (UniqueName: \"kubernetes.io/projected/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-kube-api-access-2z7p2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.568878 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.569149 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.569234 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z7p2\" (UniqueName: \"kubernetes.io/projected/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-kube-api-access-2z7p2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.573413 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.575996 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.593442 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z7p2\" (UniqueName: \"kubernetes.io/projected/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-kube-api-access-2z7p2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.708969 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:19 crc kubenswrapper[4778]: I0318 09:33:19.293403 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"] Mar 18 09:33:20 crc kubenswrapper[4778]: I0318 09:33:20.303727 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" event={"ID":"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1","Type":"ContainerStarted","Data":"e73cd1a7f2cc415ed8844ab76dbcafbf6d6af8f6453df69ed69f66ec8f314dde"} Mar 18 09:33:20 crc kubenswrapper[4778]: I0318 09:33:20.304054 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" event={"ID":"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1","Type":"ContainerStarted","Data":"0ecf314c0ee29bbd7c8f5498678c43576aa8cd1ac998b8bfc3d0c895c7d8e42f"} Mar 18 09:33:20 crc kubenswrapper[4778]: I0318 09:33:20.320861 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" podStartSLOduration=1.842169702 podStartE2EDuration="2.32081191s" podCreationTimestamp="2026-03-18 09:33:18 +0000 UTC" firstStartedPulling="2026-03-18 09:33:19.306971656 +0000 UTC m=+1865.881716496" lastFinishedPulling="2026-03-18 09:33:19.785613824 +0000 UTC m=+1866.360358704" observedRunningTime="2026-03-18 09:33:20.318731253 +0000 UTC m=+1866.893476093" watchObservedRunningTime="2026-03-18 09:33:20.32081191 +0000 UTC m=+1866.895556750" Mar 18 09:33:23 crc kubenswrapper[4778]: I0318 09:33:23.053818 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-zbghp"] Mar 18 09:33:23 crc kubenswrapper[4778]: I0318 09:33:23.063755 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-zbghp"] Mar 18 09:33:24 crc kubenswrapper[4778]: I0318 09:33:24.204096 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d42905f-c189-4021-834d-f2a81dae5a4a" path="/var/lib/kubelet/pods/4d42905f-c189-4021-834d-f2a81dae5a4a/volumes" Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.187338 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:33:29 crc kubenswrapper[4778]: E0318 09:33:29.188223 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.730652 4778 scope.go:117] "RemoveContainer" containerID="70d53574867291895895df87d2a68bed084a005ff2e35622dba06f6dac00a1ee" Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.763809 4778 scope.go:117] "RemoveContainer" containerID="8485de1959de5e473a1a0282e19bec9c8061e5419357cbeb799b7cc895e3b146" Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.817378 4778 scope.go:117] "RemoveContainer" containerID="76d9700b7eab0fc318bf79deafd901e277918a900858f790ff6f7d08ee5e2133" Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.863330 4778 scope.go:117] "RemoveContainer" containerID="abc212acc9fea22cd31581d6e2bb923603370c1ccdef0851c69537c07eedf089" Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.917921 4778 scope.go:117] "RemoveContainer" containerID="aa018cf8a109c6b5750c2118b0eb74eb108759f278376c856e84638cf2d31164" Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.936474 4778 scope.go:117] "RemoveContainer" containerID="61d521ae036849914a4701bb867f10a55ba71a80cea0eab40620c4a6aa10638d" Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.974134 4778 
scope.go:117] "RemoveContainer" containerID="0cbb671a57344b775f4ff6d3749d585e5e115edb6d8d987453203f44ff882ff0" Mar 18 09:33:30 crc kubenswrapper[4778]: I0318 09:33:30.012056 4778 scope.go:117] "RemoveContainer" containerID="8c18edac4f9dbd5b2d5ca7397e82de877b46a7b70e08421fff14ad85201dac7d" Mar 18 09:33:30 crc kubenswrapper[4778]: I0318 09:33:30.046851 4778 scope.go:117] "RemoveContainer" containerID="c1c9a9ac26d842e9f380c2bc10d90467713b713ce0c3ac97f08ea834682773ee" Mar 18 09:33:30 crc kubenswrapper[4778]: I0318 09:33:30.067665 4778 scope.go:117] "RemoveContainer" containerID="1b70618e3e5fc20f170a550bf06d195893c5a61a58727ad242d6881bbcef4e7a" Mar 18 09:33:37 crc kubenswrapper[4778]: I0318 09:33:37.044217 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-98prp"] Mar 18 09:33:37 crc kubenswrapper[4778]: I0318 09:33:37.054429 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2p9jg"] Mar 18 09:33:37 crc kubenswrapper[4778]: I0318 09:33:37.065144 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5drhw"] Mar 18 09:33:37 crc kubenswrapper[4778]: I0318 09:33:37.075260 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-98prp"] Mar 18 09:33:37 crc kubenswrapper[4778]: I0318 09:33:37.085682 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2p9jg"] Mar 18 09:33:37 crc kubenswrapper[4778]: I0318 09:33:37.097222 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5drhw"] Mar 18 09:33:38 crc kubenswrapper[4778]: I0318 09:33:38.197495 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb26926-fc81-4024-a0fa-2363d8703d72" path="/var/lib/kubelet/pods/0fb26926-fc81-4024-a0fa-2363d8703d72/volumes" Mar 18 09:33:38 crc kubenswrapper[4778]: I0318 09:33:38.199714 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" path="/var/lib/kubelet/pods/20eafe8e-c0b9-4463-bc12-8c0cd0359968/volumes" Mar 18 09:33:38 crc kubenswrapper[4778]: I0318 09:33:38.200486 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" path="/var/lib/kubelet/pods/4135fc20-df28-4f8d-b244-aedd5ed57cc2/volumes" Mar 18 09:33:40 crc kubenswrapper[4778]: I0318 09:33:40.187221 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:33:40 crc kubenswrapper[4778]: E0318 09:33:40.187958 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:33:51 crc kubenswrapper[4778]: I0318 09:33:51.049534 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jb4ss"] Mar 18 09:33:51 crc kubenswrapper[4778]: I0318 09:33:51.062163 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jb4ss"] Mar 18 09:33:52 crc kubenswrapper[4778]: I0318 09:33:52.205573 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fba399d9-71ac-41c3-912f-32ccc7fc6190" path="/var/lib/kubelet/pods/fba399d9-71ac-41c3-912f-32ccc7fc6190/volumes" Mar 18 09:33:55 crc kubenswrapper[4778]: I0318 09:33:55.187686 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:33:55 crc kubenswrapper[4778]: E0318 09:33:55.188718 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.157141 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563774-jn4n5"] Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.159217 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563774-jn4n5" Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.162105 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.166107 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.166252 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.204344 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563774-jn4n5"] Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.309701 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-247bb\" (UniqueName: \"kubernetes.io/projected/315606e3-7197-4234-b672-400a86339d27-kube-api-access-247bb\") pod \"auto-csr-approver-29563774-jn4n5\" (UID: \"315606e3-7197-4234-b672-400a86339d27\") " pod="openshift-infra/auto-csr-approver-29563774-jn4n5" Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.412288 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-247bb\" (UniqueName: 
\"kubernetes.io/projected/315606e3-7197-4234-b672-400a86339d27-kube-api-access-247bb\") pod \"auto-csr-approver-29563774-jn4n5\" (UID: \"315606e3-7197-4234-b672-400a86339d27\") " pod="openshift-infra/auto-csr-approver-29563774-jn4n5" Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.433430 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-247bb\" (UniqueName: \"kubernetes.io/projected/315606e3-7197-4234-b672-400a86339d27-kube-api-access-247bb\") pod \"auto-csr-approver-29563774-jn4n5\" (UID: \"315606e3-7197-4234-b672-400a86339d27\") " pod="openshift-infra/auto-csr-approver-29563774-jn4n5" Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.482799 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563774-jn4n5" Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.970829 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563774-jn4n5"] Mar 18 09:34:01 crc kubenswrapper[4778]: I0318 09:34:01.718513 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563774-jn4n5" event={"ID":"315606e3-7197-4234-b672-400a86339d27","Type":"ContainerStarted","Data":"d8212f543f3dcc66d1290434aca3b32ee334c67ba0e61b2406393b9e448aa645"} Mar 18 09:34:02 crc kubenswrapper[4778]: I0318 09:34:02.735575 4778 generic.go:334] "Generic (PLEG): container finished" podID="315606e3-7197-4234-b672-400a86339d27" containerID="3e7ed49b01f49625749fbe5496f4ace13851a2f40c5fbcd9633d28b842edcbb0" exitCode=0 Mar 18 09:34:02 crc kubenswrapper[4778]: I0318 09:34:02.735697 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563774-jn4n5" event={"ID":"315606e3-7197-4234-b672-400a86339d27","Type":"ContainerDied","Data":"3e7ed49b01f49625749fbe5496f4ace13851a2f40c5fbcd9633d28b842edcbb0"} Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.129193 4778 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563774-jn4n5" Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.298784 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-247bb\" (UniqueName: \"kubernetes.io/projected/315606e3-7197-4234-b672-400a86339d27-kube-api-access-247bb\") pod \"315606e3-7197-4234-b672-400a86339d27\" (UID: \"315606e3-7197-4234-b672-400a86339d27\") " Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.304975 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315606e3-7197-4234-b672-400a86339d27-kube-api-access-247bb" (OuterVolumeSpecName: "kube-api-access-247bb") pod "315606e3-7197-4234-b672-400a86339d27" (UID: "315606e3-7197-4234-b672-400a86339d27"). InnerVolumeSpecName "kube-api-access-247bb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.401353 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-247bb\" (UniqueName: \"kubernetes.io/projected/315606e3-7197-4234-b672-400a86339d27-kube-api-access-247bb\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.757551 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563774-jn4n5" event={"ID":"315606e3-7197-4234-b672-400a86339d27","Type":"ContainerDied","Data":"d8212f543f3dcc66d1290434aca3b32ee334c67ba0e61b2406393b9e448aa645"} Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.757607 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8212f543f3dcc66d1290434aca3b32ee334c67ba0e61b2406393b9e448aa645" Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.757690 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563774-jn4n5" Mar 18 09:34:05 crc kubenswrapper[4778]: I0318 09:34:05.189370 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563768-4z27c"] Mar 18 09:34:05 crc kubenswrapper[4778]: I0318 09:34:05.196085 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563768-4z27c"] Mar 18 09:34:06 crc kubenswrapper[4778]: I0318 09:34:06.202484 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec16e337-91fc-40c5-b3d4-87b5243e5a73" path="/var/lib/kubelet/pods/ec16e337-91fc-40c5-b3d4-87b5243e5a73/volumes" Mar 18 09:34:06 crc kubenswrapper[4778]: I0318 09:34:06.777578 4778 generic.go:334] "Generic (PLEG): container finished" podID="637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" containerID="e73cd1a7f2cc415ed8844ab76dbcafbf6d6af8f6453df69ed69f66ec8f314dde" exitCode=0 Mar 18 09:34:06 crc kubenswrapper[4778]: I0318 09:34:06.777702 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" event={"ID":"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1","Type":"ContainerDied","Data":"e73cd1a7f2cc415ed8844ab76dbcafbf6d6af8f6453df69ed69f66ec8f314dde"} Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.187325 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:34:08 crc kubenswrapper[4778]: E0318 09:34:08.188087 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 
09:34:08.190610 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.378935 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-ssh-key-openstack-edpm-ipam\") pod \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.379327 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-inventory\") pod \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.379408 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z7p2\" (UniqueName: \"kubernetes.io/projected/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-kube-api-access-2z7p2\") pod \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.386825 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-kube-api-access-2z7p2" (OuterVolumeSpecName: "kube-api-access-2z7p2") pod "637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" (UID: "637a34d6-3284-4d4f-ab9b-70fd0b4d29b1"). InnerVolumeSpecName "kube-api-access-2z7p2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.409828 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-inventory" (OuterVolumeSpecName: "inventory") pod "637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" (UID: "637a34d6-3284-4d4f-ab9b-70fd0b4d29b1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.421468 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" (UID: "637a34d6-3284-4d4f-ab9b-70fd0b4d29b1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.481587 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z7p2\" (UniqueName: \"kubernetes.io/projected/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-kube-api-access-2z7p2\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.481615 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.481628 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.799707 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" 
event={"ID":"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1","Type":"ContainerDied","Data":"0ecf314c0ee29bbd7c8f5498678c43576aa8cd1ac998b8bfc3d0c895c7d8e42f"} Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.799767 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ecf314c0ee29bbd7c8f5498678c43576aa8cd1ac998b8bfc3d0c895c7d8e42f" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.799787 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.896465 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zd2d7"] Mar 18 09:34:08 crc kubenswrapper[4778]: E0318 09:34:08.897042 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.897071 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:34:08 crc kubenswrapper[4778]: E0318 09:34:08.897106 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315606e3-7197-4234-b672-400a86339d27" containerName="oc" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.897122 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="315606e3-7197-4234-b672-400a86339d27" containerName="oc" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.897427 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="315606e3-7197-4234-b672-400a86339d27" containerName="oc" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.897475 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" 
containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.898430 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.900810 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.901786 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.902327 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.902809 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.907084 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zd2d7"] Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.991086 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.991269 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9662\" (UniqueName: \"kubernetes.io/projected/d85dc0a9-a7b0-4715-bc9d-974ac7657337-kube-api-access-g9662\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.991314 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.093033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9662\" (UniqueName: \"kubernetes.io/projected/d85dc0a9-a7b0-4715-bc9d-974ac7657337-kube-api-access-g9662\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.093089 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.093130 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.098189 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-ssh-key-openstack-edpm-ipam\") pod 
\"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.099290 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.120461 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9662\" (UniqueName: \"kubernetes.io/projected/d85dc0a9-a7b0-4715-bc9d-974ac7657337-kube-api-access-g9662\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.225690 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.792643 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zd2d7"] Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.814146 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" event={"ID":"d85dc0a9-a7b0-4715-bc9d-974ac7657337","Type":"ContainerStarted","Data":"97cb197e6ba89a6ea3e936d214db02d302c4c2af7d065d843450578396131b2b"} Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.564695 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-475gr"] Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.567308 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.593550 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-475gr"] Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.754621 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-utilities\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.755551 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-catalog-content\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.755767 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvbfg\" (UniqueName: \"kubernetes.io/projected/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-kube-api-access-nvbfg\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.828298 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" event={"ID":"d85dc0a9-a7b0-4715-bc9d-974ac7657337","Type":"ContainerStarted","Data":"40ec8676e1c1ccc56a543cb5a42de5f34fb52e822133f45c8ebee2d755dab39f"} Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.846270 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" podStartSLOduration=3.072908139 podStartE2EDuration="3.846250141s" podCreationTimestamp="2026-03-18 09:34:08 +0000 UTC" firstStartedPulling="2026-03-18 09:34:09.794475837 +0000 UTC m=+1916.369220687" lastFinishedPulling="2026-03-18 09:34:10.567817829 +0000 UTC m=+1917.142562689" observedRunningTime="2026-03-18 09:34:11.842659853 +0000 UTC m=+1918.417404693" watchObservedRunningTime="2026-03-18 09:34:11.846250141 +0000 UTC m=+1918.420994991" Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.857418 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-utilities\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.857750 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-catalog-content\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.857961 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvbfg\" (UniqueName: \"kubernetes.io/projected/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-kube-api-access-nvbfg\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.858694 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-catalog-content\") pod \"certified-operators-475gr\" (UID: 
\"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.858877 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-utilities\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.885033 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvbfg\" (UniqueName: \"kubernetes.io/projected/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-kube-api-access-nvbfg\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.893834 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:12 crc kubenswrapper[4778]: I0318 09:34:12.423491 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-475gr"] Mar 18 09:34:12 crc kubenswrapper[4778]: W0318 09:34:12.432312 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d3beaac_46a9_4ec0_bfd5_ee225f4bb32e.slice/crio-b4fb4c9857fb657a54cf9e724f2ffd94c53e85dc57ae0bc608d2d9e727aba66f WatchSource:0}: Error finding container b4fb4c9857fb657a54cf9e724f2ffd94c53e85dc57ae0bc608d2d9e727aba66f: Status 404 returned error can't find the container with id b4fb4c9857fb657a54cf9e724f2ffd94c53e85dc57ae0bc608d2d9e727aba66f Mar 18 09:34:12 crc kubenswrapper[4778]: I0318 09:34:12.841552 4778 generic.go:334] "Generic (PLEG): container finished" podID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" 
containerID="4bac5ae2ef88828781e3d752d0712c190c833bc1c5c36fafcba65e3a7b425e6a" exitCode=0 Mar 18 09:34:12 crc kubenswrapper[4778]: I0318 09:34:12.841674 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-475gr" event={"ID":"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e","Type":"ContainerDied","Data":"4bac5ae2ef88828781e3d752d0712c190c833bc1c5c36fafcba65e3a7b425e6a"} Mar 18 09:34:12 crc kubenswrapper[4778]: I0318 09:34:12.841904 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-475gr" event={"ID":"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e","Type":"ContainerStarted","Data":"b4fb4c9857fb657a54cf9e724f2ffd94c53e85dc57ae0bc608d2d9e727aba66f"} Mar 18 09:34:14 crc kubenswrapper[4778]: I0318 09:34:14.868847 4778 generic.go:334] "Generic (PLEG): container finished" podID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerID="60f6315316fa666e5fb06d88f5b2aedb8a14343e1036e621abc9b1a882b8b80c" exitCode=0 Mar 18 09:34:14 crc kubenswrapper[4778]: I0318 09:34:14.868900 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-475gr" event={"ID":"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e","Type":"ContainerDied","Data":"60f6315316fa666e5fb06d88f5b2aedb8a14343e1036e621abc9b1a882b8b80c"} Mar 18 09:34:15 crc kubenswrapper[4778]: I0318 09:34:15.884552 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-475gr" event={"ID":"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e","Type":"ContainerStarted","Data":"7ec44f5cc97d7eda5aa101e36380c2c45ff0732b9ec9263200e6e089138517e7"} Mar 18 09:34:15 crc kubenswrapper[4778]: I0318 09:34:15.908616 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-475gr" podStartSLOduration=2.414798714 podStartE2EDuration="4.908592115s" podCreationTimestamp="2026-03-18 09:34:11 +0000 UTC" firstStartedPulling="2026-03-18 09:34:12.844321922 
+0000 UTC m=+1919.419066762" lastFinishedPulling="2026-03-18 09:34:15.338115323 +0000 UTC m=+1921.912860163" observedRunningTime="2026-03-18 09:34:15.900791395 +0000 UTC m=+1922.475536245" watchObservedRunningTime="2026-03-18 09:34:15.908592115 +0000 UTC m=+1922.483336955" Mar 18 09:34:17 crc kubenswrapper[4778]: I0318 09:34:17.915075 4778 generic.go:334] "Generic (PLEG): container finished" podID="d85dc0a9-a7b0-4715-bc9d-974ac7657337" containerID="40ec8676e1c1ccc56a543cb5a42de5f34fb52e822133f45c8ebee2d755dab39f" exitCode=0 Mar 18 09:34:17 crc kubenswrapper[4778]: I0318 09:34:17.915266 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" event={"ID":"d85dc0a9-a7b0-4715-bc9d-974ac7657337","Type":"ContainerDied","Data":"40ec8676e1c1ccc56a543cb5a42de5f34fb52e822133f45c8ebee2d755dab39f"} Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.386205 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.418550 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-inventory-0\") pod \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.418620 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9662\" (UniqueName: \"kubernetes.io/projected/d85dc0a9-a7b0-4715-bc9d-974ac7657337-kube-api-access-g9662\") pod \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.418671 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-ssh-key-openstack-edpm-ipam\") pod \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.460664 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d85dc0a9-a7b0-4715-bc9d-974ac7657337-kube-api-access-g9662" (OuterVolumeSpecName: "kube-api-access-g9662") pod "d85dc0a9-a7b0-4715-bc9d-974ac7657337" (UID: "d85dc0a9-a7b0-4715-bc9d-974ac7657337"). InnerVolumeSpecName "kube-api-access-g9662". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.462618 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d85dc0a9-a7b0-4715-bc9d-974ac7657337" (UID: "d85dc0a9-a7b0-4715-bc9d-974ac7657337"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.464132 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d85dc0a9-a7b0-4715-bc9d-974ac7657337" (UID: "d85dc0a9-a7b0-4715-bc9d-974ac7657337"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.520190 4778 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.520243 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9662\" (UniqueName: \"kubernetes.io/projected/d85dc0a9-a7b0-4715-bc9d-974ac7657337-kube-api-access-g9662\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.520257 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.941023 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" event={"ID":"d85dc0a9-a7b0-4715-bc9d-974ac7657337","Type":"ContainerDied","Data":"97cb197e6ba89a6ea3e936d214db02d302c4c2af7d065d843450578396131b2b"} Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.941068 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97cb197e6ba89a6ea3e936d214db02d302c4c2af7d065d843450578396131b2b" Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.941088 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.021392 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt"] Mar 18 09:34:20 crc kubenswrapper[4778]: E0318 09:34:20.021815 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d85dc0a9-a7b0-4715-bc9d-974ac7657337" containerName="ssh-known-hosts-edpm-deployment" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.021835 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d85dc0a9-a7b0-4715-bc9d-974ac7657337" containerName="ssh-known-hosts-edpm-deployment" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.022058 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d85dc0a9-a7b0-4715-bc9d-974ac7657337" containerName="ssh-known-hosts-edpm-deployment" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.022768 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.026401 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.026491 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.026868 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.028104 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.039021 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt"] Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.129792 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdcqp\" (UniqueName: \"kubernetes.io/projected/0b7ef620-077d-4a38-90ed-fed05ccba5d2-kube-api-access-vdcqp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.129901 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.129976 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.231126 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdcqp\" (UniqueName: \"kubernetes.io/projected/0b7ef620-077d-4a38-90ed-fed05ccba5d2-kube-api-access-vdcqp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.231228 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.231299 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.244260 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: 
\"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.244621 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.250228 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdcqp\" (UniqueName: \"kubernetes.io/projected/0b7ef620-077d-4a38-90ed-fed05ccba5d2-kube-api-access-vdcqp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.343528 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.924905 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt"] Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.953030 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" event={"ID":"0b7ef620-077d-4a38-90ed-fed05ccba5d2","Type":"ContainerStarted","Data":"8822c7df21f30bb6af9e1f11ae0a03462e98cfd9b7aec393cc5e0a76ed0fa331"} Mar 18 09:34:21 crc kubenswrapper[4778]: I0318 09:34:21.894156 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:21 crc kubenswrapper[4778]: I0318 09:34:21.894533 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:21 crc kubenswrapper[4778]: I0318 09:34:21.948844 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:21 crc kubenswrapper[4778]: I0318 09:34:21.968926 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" event={"ID":"0b7ef620-077d-4a38-90ed-fed05ccba5d2","Type":"ContainerStarted","Data":"16287400dde13e1c43e43b5d82fb77152a697e54c70364b3f589e427eeb0ea66"} Mar 18 09:34:22 crc kubenswrapper[4778]: I0318 09:34:22.015354 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" podStartSLOduration=1.57121584 podStartE2EDuration="2.015336214s" podCreationTimestamp="2026-03-18 09:34:20 +0000 UTC" firstStartedPulling="2026-03-18 09:34:20.937302217 +0000 UTC m=+1927.512047057" lastFinishedPulling="2026-03-18 09:34:21.381422581 +0000 UTC m=+1927.956167431" 
observedRunningTime="2026-03-18 09:34:22.008435138 +0000 UTC m=+1928.583179998" watchObservedRunningTime="2026-03-18 09:34:22.015336214 +0000 UTC m=+1928.590081064" Mar 18 09:34:22 crc kubenswrapper[4778]: I0318 09:34:22.032113 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:22 crc kubenswrapper[4778]: I0318 09:34:22.188058 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:34:22 crc kubenswrapper[4778]: E0318 09:34:22.188381 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:34:24 crc kubenswrapper[4778]: I0318 09:34:24.318172 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-475gr"] Mar 18 09:34:24 crc kubenswrapper[4778]: I0318 09:34:24.319108 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-475gr" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="registry-server" containerID="cri-o://7ec44f5cc97d7eda5aa101e36380c2c45ff0732b9ec9263200e6e089138517e7" gracePeriod=2 Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.003594 4778 generic.go:334] "Generic (PLEG): container finished" podID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerID="7ec44f5cc97d7eda5aa101e36380c2c45ff0732b9ec9263200e6e089138517e7" exitCode=0 Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.003867 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-475gr" 
event={"ID":"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e","Type":"ContainerDied","Data":"7ec44f5cc97d7eda5aa101e36380c2c45ff0732b9ec9263200e6e089138517e7"} Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.271578 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.340516 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvbfg\" (UniqueName: \"kubernetes.io/projected/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-kube-api-access-nvbfg\") pod \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.340640 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-utilities\") pod \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.340669 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-catalog-content\") pod \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.341894 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-utilities" (OuterVolumeSpecName: "utilities") pod "6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" (UID: "6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.346679 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-kube-api-access-nvbfg" (OuterVolumeSpecName: "kube-api-access-nvbfg") pod "6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" (UID: "6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e"). InnerVolumeSpecName "kube-api-access-nvbfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.393627 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" (UID: "6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.441949 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.441979 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.441990 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvbfg\" (UniqueName: \"kubernetes.io/projected/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-kube-api-access-nvbfg\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.015970 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-475gr" 
event={"ID":"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e","Type":"ContainerDied","Data":"b4fb4c9857fb657a54cf9e724f2ffd94c53e85dc57ae0bc608d2d9e727aba66f"} Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.016281 4778 scope.go:117] "RemoveContainer" containerID="7ec44f5cc97d7eda5aa101e36380c2c45ff0732b9ec9263200e6e089138517e7" Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.016040 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.035160 4778 scope.go:117] "RemoveContainer" containerID="60f6315316fa666e5fb06d88f5b2aedb8a14343e1036e621abc9b1a882b8b80c" Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.055429 4778 scope.go:117] "RemoveContainer" containerID="4bac5ae2ef88828781e3d752d0712c190c833bc1c5c36fafcba65e3a7b425e6a" Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.059161 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-475gr"] Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.071350 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-475gr"] Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.200841 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" path="/var/lib/kubelet/pods/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e/volumes" Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.057341 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" event={"ID":"0b7ef620-077d-4a38-90ed-fed05ccba5d2","Type":"ContainerDied","Data":"16287400dde13e1c43e43b5d82fb77152a697e54c70364b3f589e427eeb0ea66"} Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.057326 4778 generic.go:334] "Generic (PLEG): container finished" podID="0b7ef620-077d-4a38-90ed-fed05ccba5d2" 
containerID="16287400dde13e1c43e43b5d82fb77152a697e54c70364b3f589e427eeb0ea66" exitCode=0 Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.239285 4778 scope.go:117] "RemoveContainer" containerID="201dd8b3293289bdbf9f29c3749f98499b07694d8d80e9df99ed62c3075ec93f" Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.303490 4778 scope.go:117] "RemoveContainer" containerID="a9c5d9789a0793c932c41c72d2058c14c6dc506dbd6a4bb8ed76c0353ce8bcc2" Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.351839 4778 scope.go:117] "RemoveContainer" containerID="b5325ab7bf5fcc801abe0c67c554c5ab72e440e7503aafe03c45a398c6a12432" Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.441679 4778 scope.go:117] "RemoveContainer" containerID="43cd9dd19c1bcb6dd251052b9f5e2f4fd14b11fc891320649bdbfdb44d9ca171" Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.471598 4778 scope.go:117] "RemoveContainer" containerID="f3f45a8c98da9f26aefaed6f12fcf1ccfeb1c540f04357427bf8f13c5c12ad79" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.588414 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.655972 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdcqp\" (UniqueName: \"kubernetes.io/projected/0b7ef620-077d-4a38-90ed-fed05ccba5d2-kube-api-access-vdcqp\") pod \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.656127 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-inventory\") pod \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.656340 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-ssh-key-openstack-edpm-ipam\") pod \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.677515 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7ef620-077d-4a38-90ed-fed05ccba5d2-kube-api-access-vdcqp" (OuterVolumeSpecName: "kube-api-access-vdcqp") pod "0b7ef620-077d-4a38-90ed-fed05ccba5d2" (UID: "0b7ef620-077d-4a38-90ed-fed05ccba5d2"). InnerVolumeSpecName "kube-api-access-vdcqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.688356 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-inventory" (OuterVolumeSpecName: "inventory") pod "0b7ef620-077d-4a38-90ed-fed05ccba5d2" (UID: "0b7ef620-077d-4a38-90ed-fed05ccba5d2"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.695474 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0b7ef620-077d-4a38-90ed-fed05ccba5d2" (UID: "0b7ef620-077d-4a38-90ed-fed05ccba5d2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.758778 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.758831 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdcqp\" (UniqueName: \"kubernetes.io/projected/0b7ef620-077d-4a38-90ed-fed05ccba5d2-kube-api-access-vdcqp\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.758842 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.086525 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" event={"ID":"0b7ef620-077d-4a38-90ed-fed05ccba5d2","Type":"ContainerDied","Data":"8822c7df21f30bb6af9e1f11ae0a03462e98cfd9b7aec393cc5e0a76ed0fa331"} Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.086650 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8822c7df21f30bb6af9e1f11ae0a03462e98cfd9b7aec393cc5e0a76ed0fa331" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 
09:34:32.086968 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.235638 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2"] Mar 18 09:34:32 crc kubenswrapper[4778]: E0318 09:34:32.236051 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7ef620-077d-4a38-90ed-fed05ccba5d2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236068 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7ef620-077d-4a38-90ed-fed05ccba5d2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:34:32 crc kubenswrapper[4778]: E0318 09:34:32.236085 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="extract-content" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236091 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="extract-content" Mar 18 09:34:32 crc kubenswrapper[4778]: E0318 09:34:32.236110 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="registry-server" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236116 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="registry-server" Mar 18 09:34:32 crc kubenswrapper[4778]: E0318 09:34:32.236136 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="extract-utilities" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236143 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="extract-utilities" Mar 18 
09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236328 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7ef620-077d-4a38-90ed-fed05ccba5d2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236341 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="registry-server" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236938 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.239013 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.239234 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.239722 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.241680 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.245048 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2"] Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.371917 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79w68\" (UniqueName: \"kubernetes.io/projected/bf3315b0-1ace-422a-8049-3fd13fe46e65-kube-api-access-79w68\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.372138 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.372241 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.474752 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.474845 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.474941 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-79w68\" (UniqueName: \"kubernetes.io/projected/bf3315b0-1ace-422a-8049-3fd13fe46e65-kube-api-access-79w68\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.480090 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.480666 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.497461 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79w68\" (UniqueName: \"kubernetes.io/projected/bf3315b0-1ace-422a-8049-3fd13fe46e65-kube-api-access-79w68\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.553008 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:33 crc kubenswrapper[4778]: I0318 09:34:33.158782 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2"] Mar 18 09:34:34 crc kubenswrapper[4778]: I0318 09:34:34.106473 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" event={"ID":"bf3315b0-1ace-422a-8049-3fd13fe46e65","Type":"ContainerStarted","Data":"ae502ee49eb38287e9aaaa3ab0077cb1ef93b09b02a84b50a76e8fa209d6ad0c"} Mar 18 09:34:34 crc kubenswrapper[4778]: I0318 09:34:34.106595 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" event={"ID":"bf3315b0-1ace-422a-8049-3fd13fe46e65","Type":"ContainerStarted","Data":"b6b8f26a067a2e81c9bf97aaee67a21d858c099d81a194c710b247b4bbc30b19"} Mar 18 09:34:34 crc kubenswrapper[4778]: I0318 09:34:34.136985 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" podStartSLOduration=1.7060892669999999 podStartE2EDuration="2.136958715s" podCreationTimestamp="2026-03-18 09:34:32 +0000 UTC" firstStartedPulling="2026-03-18 09:34:33.157267619 +0000 UTC m=+1939.732012459" lastFinishedPulling="2026-03-18 09:34:33.588137057 +0000 UTC m=+1940.162881907" observedRunningTime="2026-03-18 09:34:34.128798665 +0000 UTC m=+1940.703543535" watchObservedRunningTime="2026-03-18 09:34:34.136958715 +0000 UTC m=+1940.711703595" Mar 18 09:34:35 crc kubenswrapper[4778]: I0318 09:34:35.188804 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:34:36 crc kubenswrapper[4778]: I0318 09:34:36.133414 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"0190a17d7c9066ee03765200824080dd651faade8ac0a106f1878805db93f225"} Mar 18 09:34:42 crc kubenswrapper[4778]: I0318 09:34:42.062314 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-nl2dg"] Mar 18 09:34:42 crc kubenswrapper[4778]: I0318 09:34:42.074669 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-nl2dg"] Mar 18 09:34:42 crc kubenswrapper[4778]: I0318 09:34:42.209495 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8341ceba-13e0-410f-a7d2-23190a07d914" path="/var/lib/kubelet/pods/8341ceba-13e0-410f-a7d2-23190a07d914/volumes" Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.068610 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-922c-account-create-update-6z2xf"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.087719 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9hlqk"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.104214 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-t5x58"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.113736 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9hlqk"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.124103 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-722f-account-create-update-slwd5"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.133885 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-922c-account-create-update-6z2xf"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.141291 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b9f9-account-create-update-w7q2n"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.152894 4778 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-t5x58"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.162130 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-722f-account-create-update-slwd5"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.171253 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b9f9-account-create-update-w7q2n"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.200108 4778 generic.go:334] "Generic (PLEG): container finished" podID="bf3315b0-1ace-422a-8049-3fd13fe46e65" containerID="ae502ee49eb38287e9aaaa3ab0077cb1ef93b09b02a84b50a76e8fa209d6ad0c" exitCode=0 Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.200138 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" event={"ID":"bf3315b0-1ace-422a-8049-3fd13fe46e65","Type":"ContainerDied","Data":"ae502ee49eb38287e9aaaa3ab0077cb1ef93b09b02a84b50a76e8fa209d6ad0c"} Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.224374 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f06b776-36bc-45ba-88d4-69608f9665e6" path="/var/lib/kubelet/pods/2f06b776-36bc-45ba-88d4-69608f9665e6/volumes" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.227490 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" path="/var/lib/kubelet/pods/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d/volumes" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.228292 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92444732-2d3e-4065-a336-74b37b711530" path="/var/lib/kubelet/pods/92444732-2d3e-4065-a336-74b37b711530/volumes" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.228831 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b380dfb3-b55b-4db2-bd8f-a90b4470345d" 
path="/var/lib/kubelet/pods/b380dfb3-b55b-4db2-bd8f-a90b4470345d/volumes" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.229841 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" path="/var/lib/kubelet/pods/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e/volumes" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.609457 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.687358 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-ssh-key-openstack-edpm-ipam\") pod \"bf3315b0-1ace-422a-8049-3fd13fe46e65\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.687555 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79w68\" (UniqueName: \"kubernetes.io/projected/bf3315b0-1ace-422a-8049-3fd13fe46e65-kube-api-access-79w68\") pod \"bf3315b0-1ace-422a-8049-3fd13fe46e65\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.687614 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-inventory\") pod \"bf3315b0-1ace-422a-8049-3fd13fe46e65\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.699953 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3315b0-1ace-422a-8049-3fd13fe46e65-kube-api-access-79w68" (OuterVolumeSpecName: "kube-api-access-79w68") pod "bf3315b0-1ace-422a-8049-3fd13fe46e65" (UID: "bf3315b0-1ace-422a-8049-3fd13fe46e65"). 
InnerVolumeSpecName "kube-api-access-79w68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.715181 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bf3315b0-1ace-422a-8049-3fd13fe46e65" (UID: "bf3315b0-1ace-422a-8049-3fd13fe46e65"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.723189 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-inventory" (OuterVolumeSpecName: "inventory") pod "bf3315b0-1ace-422a-8049-3fd13fe46e65" (UID: "bf3315b0-1ace-422a-8049-3fd13fe46e65"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.789640 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79w68\" (UniqueName: \"kubernetes.io/projected/bf3315b0-1ace-422a-8049-3fd13fe46e65-kube-api-access-79w68\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.789677 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.789690 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:45 crc kubenswrapper[4778]: I0318 09:34:45.239629 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" event={"ID":"bf3315b0-1ace-422a-8049-3fd13fe46e65","Type":"ContainerDied","Data":"b6b8f26a067a2e81c9bf97aaee67a21d858c099d81a194c710b247b4bbc30b19"} Mar 18 09:34:45 crc kubenswrapper[4778]: I0318 09:34:45.240135 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b8f26a067a2e81c9bf97aaee67a21d858c099d81a194c710b247b4bbc30b19" Mar 18 09:34:45 crc kubenswrapper[4778]: I0318 09:34:45.239927 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:35:12 crc kubenswrapper[4778]: I0318 09:35:12.036623 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vhxwz"] Mar 18 09:35:12 crc kubenswrapper[4778]: I0318 09:35:12.047662 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vhxwz"] Mar 18 09:35:12 crc kubenswrapper[4778]: I0318 09:35:12.202366 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8348daa3-112d-49f7-93d8-3649ebf10eee" path="/var/lib/kubelet/pods/8348daa3-112d-49f7-93d8-3649ebf10eee/volumes" Mar 18 09:35:30 crc kubenswrapper[4778]: I0318 09:35:30.657375 4778 scope.go:117] "RemoveContainer" containerID="b0ad59dfbfbe8f98b2a7024fc11350f06ab712f37850bffb7121c440c9344960" Mar 18 09:35:30 crc kubenswrapper[4778]: I0318 09:35:30.699059 4778 scope.go:117] "RemoveContainer" containerID="aed5c2d54c93258cf5753b658cc8a1430cb39fb4faca41384d49dcc12f51df37" Mar 18 09:35:30 crc kubenswrapper[4778]: I0318 09:35:30.764310 4778 scope.go:117] "RemoveContainer" containerID="9866f0cece8384eb6d69125fd4f2648001a15f8207d97598a6f6b380c668253f" Mar 18 09:35:30 crc kubenswrapper[4778]: I0318 09:35:30.783503 4778 scope.go:117] "RemoveContainer" containerID="3cc34e35f2db07df2220b6c334d24c112405b578f89727d873a592536bc78998" Mar 18 09:35:30 crc kubenswrapper[4778]: 
I0318 09:35:30.832227 4778 scope.go:117] "RemoveContainer" containerID="47bfce503465075386d4ab81517eb08824a50d2ca76a4ab55639a7aea5948d36" Mar 18 09:35:30 crc kubenswrapper[4778]: I0318 09:35:30.871983 4778 scope.go:117] "RemoveContainer" containerID="c118c28760c4816bb842a36e485ff938333b6ae9902cf9242267aa191e3d70bf" Mar 18 09:35:30 crc kubenswrapper[4778]: I0318 09:35:30.908611 4778 scope.go:117] "RemoveContainer" containerID="7fb36f99fa48f9c60dbdcb8445fed2d769e9cb712ffc10c71b7ff46632229d69" Mar 18 09:35:34 crc kubenswrapper[4778]: I0318 09:35:34.051556 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mjs29"] Mar 18 09:35:34 crc kubenswrapper[4778]: I0318 09:35:34.059277 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mjs29"] Mar 18 09:35:34 crc kubenswrapper[4778]: I0318 09:35:34.198385 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b89439e3-a138-4aa8-98a4-2e23ce3819e0" path="/var/lib/kubelet/pods/b89439e3-a138-4aa8-98a4-2e23ce3819e0/volumes" Mar 18 09:35:35 crc kubenswrapper[4778]: I0318 09:35:35.082810 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jbjb9"] Mar 18 09:35:35 crc kubenswrapper[4778]: I0318 09:35:35.107443 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jbjb9"] Mar 18 09:35:36 crc kubenswrapper[4778]: I0318 09:35:36.198917 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85d64a6-99af-4b66-9a60-cd6a046af840" path="/var/lib/kubelet/pods/e85d64a6-99af-4b66-9a60-cd6a046af840/volumes" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.156960 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563776-bxwcm"] Mar 18 09:36:00 crc kubenswrapper[4778]: E0318 09:36:00.157965 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bf3315b0-1ace-422a-8049-3fd13fe46e65" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.157979 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3315b0-1ace-422a-8049-3fd13fe46e65" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.158166 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3315b0-1ace-422a-8049-3fd13fe46e65" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.158923 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.162805 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.162896 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.164184 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.164879 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563776-bxwcm"] Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.344977 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbr95\" (UniqueName: \"kubernetes.io/projected/b14b14c0-2e4e-420d-bdba-234de9130e4a-kube-api-access-bbr95\") pod \"auto-csr-approver-29563776-bxwcm\" (UID: \"b14b14c0-2e4e-420d-bdba-234de9130e4a\") " pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.446954 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bbr95\" (UniqueName: \"kubernetes.io/projected/b14b14c0-2e4e-420d-bdba-234de9130e4a-kube-api-access-bbr95\") pod \"auto-csr-approver-29563776-bxwcm\" (UID: \"b14b14c0-2e4e-420d-bdba-234de9130e4a\") " pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.482689 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbr95\" (UniqueName: \"kubernetes.io/projected/b14b14c0-2e4e-420d-bdba-234de9130e4a-kube-api-access-bbr95\") pod \"auto-csr-approver-29563776-bxwcm\" (UID: \"b14b14c0-2e4e-420d-bdba-234de9130e4a\") " pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.781826 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:01 crc kubenswrapper[4778]: I0318 09:36:01.244998 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563776-bxwcm"] Mar 18 09:36:02 crc kubenswrapper[4778]: I0318 09:36:02.097292 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" event={"ID":"b14b14c0-2e4e-420d-bdba-234de9130e4a","Type":"ContainerStarted","Data":"b19d1bc12e03cd0b9cf24c232aaef04916608d6f63edcf7eb54f530ee75b9ccd"} Mar 18 09:36:04 crc kubenswrapper[4778]: I0318 09:36:04.123436 4778 generic.go:334] "Generic (PLEG): container finished" podID="b14b14c0-2e4e-420d-bdba-234de9130e4a" containerID="037bb0f9fdf9935b25af1bbd8db6391c200ce1a888406ad48350f6fbf2f0253c" exitCode=0 Mar 18 09:36:04 crc kubenswrapper[4778]: I0318 09:36:04.123523 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" 
event={"ID":"b14b14c0-2e4e-420d-bdba-234de9130e4a","Type":"ContainerDied","Data":"037bb0f9fdf9935b25af1bbd8db6391c200ce1a888406ad48350f6fbf2f0253c"} Mar 18 09:36:05 crc kubenswrapper[4778]: I0318 09:36:05.555871 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:05 crc kubenswrapper[4778]: I0318 09:36:05.751281 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbr95\" (UniqueName: \"kubernetes.io/projected/b14b14c0-2e4e-420d-bdba-234de9130e4a-kube-api-access-bbr95\") pod \"b14b14c0-2e4e-420d-bdba-234de9130e4a\" (UID: \"b14b14c0-2e4e-420d-bdba-234de9130e4a\") " Mar 18 09:36:05 crc kubenswrapper[4778]: I0318 09:36:05.757942 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b14b14c0-2e4e-420d-bdba-234de9130e4a-kube-api-access-bbr95" (OuterVolumeSpecName: "kube-api-access-bbr95") pod "b14b14c0-2e4e-420d-bdba-234de9130e4a" (UID: "b14b14c0-2e4e-420d-bdba-234de9130e4a"). InnerVolumeSpecName "kube-api-access-bbr95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:36:05 crc kubenswrapper[4778]: I0318 09:36:05.853524 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbr95\" (UniqueName: \"kubernetes.io/projected/b14b14c0-2e4e-420d-bdba-234de9130e4a-kube-api-access-bbr95\") on node \"crc\" DevicePath \"\"" Mar 18 09:36:06 crc kubenswrapper[4778]: I0318 09:36:06.144731 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" event={"ID":"b14b14c0-2e4e-420d-bdba-234de9130e4a","Type":"ContainerDied","Data":"b19d1bc12e03cd0b9cf24c232aaef04916608d6f63edcf7eb54f530ee75b9ccd"} Mar 18 09:36:06 crc kubenswrapper[4778]: I0318 09:36:06.145024 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b19d1bc12e03cd0b9cf24c232aaef04916608d6f63edcf7eb54f530ee75b9ccd" Mar 18 09:36:06 crc kubenswrapper[4778]: I0318 09:36:06.144781 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:06 crc kubenswrapper[4778]: I0318 09:36:06.695788 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563770-5f9th"] Mar 18 09:36:06 crc kubenswrapper[4778]: I0318 09:36:06.705254 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563770-5f9th"] Mar 18 09:36:08 crc kubenswrapper[4778]: I0318 09:36:08.201387 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5229065-e84e-4d42-870f-1ee468bff359" path="/var/lib/kubelet/pods/d5229065-e84e-4d42-870f-1ee468bff359/volumes" Mar 18 09:36:20 crc kubenswrapper[4778]: I0318 09:36:20.047534 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-2nml6"] Mar 18 09:36:20 crc kubenswrapper[4778]: I0318 09:36:20.060328 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-2nml6"] Mar 18 09:36:20 crc kubenswrapper[4778]: I0318 09:36:20.200858 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32eb800e-69e8-4e39-ae5b-74a5eec87b00" path="/var/lib/kubelet/pods/32eb800e-69e8-4e39-ae5b-74a5eec87b00/volumes" Mar 18 09:36:31 crc kubenswrapper[4778]: I0318 09:36:31.045810 4778 scope.go:117] "RemoveContainer" containerID="973e9f8f665d67a226625f9e044e9c18b31cbfecdc6ee8dcf02562081d63ced4" Mar 18 09:36:31 crc kubenswrapper[4778]: I0318 09:36:31.107611 4778 scope.go:117] "RemoveContainer" containerID="8faf9c7a656879007008e10d6b7f5d22a002ddd8fac9065c9f561e0e336487fd" Mar 18 09:36:31 crc kubenswrapper[4778]: I0318 09:36:31.163701 4778 scope.go:117] "RemoveContainer" containerID="3858148ddf213daa44ce8f206664d3360023f6d6f91e24bcfce11a24c0f0213c" Mar 18 09:36:31 crc kubenswrapper[4778]: I0318 09:36:31.219935 4778 scope.go:117] "RemoveContainer" containerID="462bde6149a0c02e0a81e9d8cf7097470bbd3546789cdc6d2d61c3177437187e" Mar 18 09:37:00 crc kubenswrapper[4778]: I0318 09:37:00.147667 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:37:00 crc kubenswrapper[4778]: I0318 09:37:00.149553 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:37:30 crc kubenswrapper[4778]: I0318 09:37:30.148174 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:37:30 crc kubenswrapper[4778]: I0318 09:37:30.149089 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.352652 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dkw"] Mar 18 09:37:49 crc kubenswrapper[4778]: E0318 09:37:49.354132 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b14b14c0-2e4e-420d-bdba-234de9130e4a" containerName="oc" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.354269 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14b14c0-2e4e-420d-bdba-234de9130e4a" containerName="oc" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.354485 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b14b14c0-2e4e-420d-bdba-234de9130e4a" containerName="oc" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.356260 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.367265 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dkw"] Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.475312 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-utilities\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.475415 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-catalog-content\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.475507 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5f78\" (UniqueName: \"kubernetes.io/projected/eea5e508-8702-4085-b99d-43524ffd7dac-kube-api-access-c5f78\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.577444 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5f78\" (UniqueName: \"kubernetes.io/projected/eea5e508-8702-4085-b99d-43524ffd7dac-kube-api-access-c5f78\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.577521 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-utilities\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.577578 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-catalog-content\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.577995 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-catalog-content\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.578170 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-utilities\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.599513 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5f78\" (UniqueName: \"kubernetes.io/projected/eea5e508-8702-4085-b99d-43524ffd7dac-kube-api-access-c5f78\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.703159 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:50 crc kubenswrapper[4778]: I0318 09:37:50.155003 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dkw"] Mar 18 09:37:50 crc kubenswrapper[4778]: I0318 09:37:50.773857 4778 generic.go:334] "Generic (PLEG): container finished" podID="eea5e508-8702-4085-b99d-43524ffd7dac" containerID="d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee" exitCode=0 Mar 18 09:37:50 crc kubenswrapper[4778]: I0318 09:37:50.774027 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerDied","Data":"d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee"} Mar 18 09:37:50 crc kubenswrapper[4778]: I0318 09:37:50.774245 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerStarted","Data":"e10dc855312d99da1bc5c5f66a6b1da1b05bc4c76ebbe2385a7e6ad29264fe63"} Mar 18 09:37:50 crc kubenswrapper[4778]: I0318 09:37:50.776877 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:37:51 crc kubenswrapper[4778]: I0318 09:37:51.783584 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerStarted","Data":"ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45"} Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.363358 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6v22l"] Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.366281 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.394279 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6v22l"] Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.434641 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-utilities\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.434852 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-catalog-content\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.435179 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmv8p\" (UniqueName: \"kubernetes.io/projected/4d79e7e2-9db4-4307-8411-18b74d60b1b7-kube-api-access-zmv8p\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.536525 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-utilities\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.536624 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-catalog-content\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.536736 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmv8p\" (UniqueName: \"kubernetes.io/projected/4d79e7e2-9db4-4307-8411-18b74d60b1b7-kube-api-access-zmv8p\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.537640 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-utilities\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.537919 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-catalog-content\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.563589 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmv8p\" (UniqueName: \"kubernetes.io/projected/4d79e7e2-9db4-4307-8411-18b74d60b1b7-kube-api-access-zmv8p\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.690054 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.814294 4778 generic.go:334] "Generic (PLEG): container finished" podID="eea5e508-8702-4085-b99d-43524ffd7dac" containerID="ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45" exitCode=0 Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.814341 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerDied","Data":"ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45"} Mar 18 09:37:53 crc kubenswrapper[4778]: I0318 09:37:53.249592 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6v22l"] Mar 18 09:37:53 crc kubenswrapper[4778]: W0318 09:37:53.261756 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d79e7e2_9db4_4307_8411_18b74d60b1b7.slice/crio-19c45c0d8cb539fb9cbc49d484d0d217fa556329caaf0c5f55191f47880349c3 WatchSource:0}: Error finding container 19c45c0d8cb539fb9cbc49d484d0d217fa556329caaf0c5f55191f47880349c3: Status 404 returned error can't find the container with id 19c45c0d8cb539fb9cbc49d484d0d217fa556329caaf0c5f55191f47880349c3 Mar 18 09:37:53 crc kubenswrapper[4778]: I0318 09:37:53.833350 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerStarted","Data":"a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1"} Mar 18 09:37:53 crc kubenswrapper[4778]: I0318 09:37:53.836562 4778 generic.go:334] "Generic (PLEG): container finished" podID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerID="4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d" exitCode=0 Mar 18 09:37:53 crc kubenswrapper[4778]: I0318 
09:37:53.836595 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerDied","Data":"4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d"} Mar 18 09:37:53 crc kubenswrapper[4778]: I0318 09:37:53.836615 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerStarted","Data":"19c45c0d8cb539fb9cbc49d484d0d217fa556329caaf0c5f55191f47880349c3"} Mar 18 09:37:53 crc kubenswrapper[4778]: I0318 09:37:53.870936 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k7dkw" podStartSLOduration=2.324533106 podStartE2EDuration="4.870909439s" podCreationTimestamp="2026-03-18 09:37:49 +0000 UTC" firstStartedPulling="2026-03-18 09:37:50.776417273 +0000 UTC m=+2137.351162113" lastFinishedPulling="2026-03-18 09:37:53.322793606 +0000 UTC m=+2139.897538446" observedRunningTime="2026-03-18 09:37:53.857743042 +0000 UTC m=+2140.432487932" watchObservedRunningTime="2026-03-18 09:37:53.870909439 +0000 UTC m=+2140.445654299" Mar 18 09:37:54 crc kubenswrapper[4778]: I0318 09:37:54.851167 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerStarted","Data":"5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708"} Mar 18 09:37:55 crc kubenswrapper[4778]: I0318 09:37:55.862659 4778 generic.go:334] "Generic (PLEG): container finished" podID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerID="5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708" exitCode=0 Mar 18 09:37:55 crc kubenswrapper[4778]: I0318 09:37:55.862763 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" 
event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerDied","Data":"5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708"} Mar 18 09:37:56 crc kubenswrapper[4778]: I0318 09:37:56.878367 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerStarted","Data":"68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940"} Mar 18 09:37:56 crc kubenswrapper[4778]: I0318 09:37:56.914082 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6v22l" podStartSLOduration=2.463222631 podStartE2EDuration="4.914052181s" podCreationTimestamp="2026-03-18 09:37:52 +0000 UTC" firstStartedPulling="2026-03-18 09:37:53.839064216 +0000 UTC m=+2140.413809066" lastFinishedPulling="2026-03-18 09:37:56.289893766 +0000 UTC m=+2142.864638616" observedRunningTime="2026-03-18 09:37:56.903105834 +0000 UTC m=+2143.477850744" watchObservedRunningTime="2026-03-18 09:37:56.914052181 +0000 UTC m=+2143.488797051" Mar 18 09:37:59 crc kubenswrapper[4778]: I0318 09:37:59.704168 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:59 crc kubenswrapper[4778]: I0318 09:37:59.704263 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:59 crc kubenswrapper[4778]: I0318 09:37:59.783394 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:59 crc kubenswrapper[4778]: I0318 09:37:59.971163 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.147970 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563778-4dm5p"] Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.148420 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.148596 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.149354 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.149740 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.150082 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0190a17d7c9066ee03765200824080dd651faade8ac0a106f1878805db93f225"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.150144 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://0190a17d7c9066ee03765200824080dd651faade8ac0a106f1878805db93f225" gracePeriod=600 Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.156963 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.157123 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.157389 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.164286 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563778-4dm5p"] Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.284553 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49pq9\" (UniqueName: \"kubernetes.io/projected/0548485b-4f03-47ba-8a13-4e3522451291-kube-api-access-49pq9\") pod \"auto-csr-approver-29563778-4dm5p\" (UID: \"0548485b-4f03-47ba-8a13-4e3522451291\") " 
pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.389534 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49pq9\" (UniqueName: \"kubernetes.io/projected/0548485b-4f03-47ba-8a13-4e3522451291-kube-api-access-49pq9\") pod \"auto-csr-approver-29563778-4dm5p\" (UID: \"0548485b-4f03-47ba-8a13-4e3522451291\") " pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.428066 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49pq9\" (UniqueName: \"kubernetes.io/projected/0548485b-4f03-47ba-8a13-4e3522451291-kube-api-access-49pq9\") pod \"auto-csr-approver-29563778-4dm5p\" (UID: \"0548485b-4f03-47ba-8a13-4e3522451291\") " pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.472814 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.777231 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563778-4dm5p"] Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.916940 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="0190a17d7c9066ee03765200824080dd651faade8ac0a106f1878805db93f225" exitCode=0 Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.917003 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"0190a17d7c9066ee03765200824080dd651faade8ac0a106f1878805db93f225"} Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.917042 4778 scope.go:117] "RemoveContainer" 
containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.919743 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" event={"ID":"0548485b-4f03-47ba-8a13-4e3522451291","Type":"ContainerStarted","Data":"8465549329bfd92b52a0d5ceef063d0f55bdc478460f4cbdf9d413424cbc4cc5"} Mar 18 09:38:01 crc kubenswrapper[4778]: I0318 09:38:01.139683 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dkw"] Mar 18 09:38:01 crc kubenswrapper[4778]: I0318 09:38:01.934586 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9"} Mar 18 09:38:01 crc kubenswrapper[4778]: I0318 09:38:01.934726 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k7dkw" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="registry-server" containerID="cri-o://a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1" gracePeriod=2 Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.397262 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.531343 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-utilities\") pod \"eea5e508-8702-4085-b99d-43524ffd7dac\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.531510 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5f78\" (UniqueName: \"kubernetes.io/projected/eea5e508-8702-4085-b99d-43524ffd7dac-kube-api-access-c5f78\") pod \"eea5e508-8702-4085-b99d-43524ffd7dac\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.531566 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-catalog-content\") pod \"eea5e508-8702-4085-b99d-43524ffd7dac\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.533594 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-utilities" (OuterVolumeSpecName: "utilities") pod "eea5e508-8702-4085-b99d-43524ffd7dac" (UID: "eea5e508-8702-4085-b99d-43524ffd7dac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.540743 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea5e508-8702-4085-b99d-43524ffd7dac-kube-api-access-c5f78" (OuterVolumeSpecName: "kube-api-access-c5f78") pod "eea5e508-8702-4085-b99d-43524ffd7dac" (UID: "eea5e508-8702-4085-b99d-43524ffd7dac"). InnerVolumeSpecName "kube-api-access-c5f78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.584724 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eea5e508-8702-4085-b99d-43524ffd7dac" (UID: "eea5e508-8702-4085-b99d-43524ffd7dac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.633740 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.633785 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5f78\" (UniqueName: \"kubernetes.io/projected/eea5e508-8702-4085-b99d-43524ffd7dac-kube-api-access-c5f78\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.633797 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.691136 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.691439 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.749774 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.952272 4778 generic.go:334] "Generic (PLEG): container 
finished" podID="0548485b-4f03-47ba-8a13-4e3522451291" containerID="f6adcd9d5f24124681eed0d00263f7ac4a19be40ad724c067b9849cb1ce141e4" exitCode=0 Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.953288 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" event={"ID":"0548485b-4f03-47ba-8a13-4e3522451291","Type":"ContainerDied","Data":"f6adcd9d5f24124681eed0d00263f7ac4a19be40ad724c067b9849cb1ce141e4"} Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.956427 4778 generic.go:334] "Generic (PLEG): container finished" podID="eea5e508-8702-4085-b99d-43524ffd7dac" containerID="a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1" exitCode=0 Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.957595 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.962570 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerDied","Data":"a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1"} Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.962644 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerDied","Data":"e10dc855312d99da1bc5c5f66a6b1da1b05bc4c76ebbe2385a7e6ad29264fe63"} Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.962675 4778 scope.go:117] "RemoveContainer" containerID="a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.000837 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dkw"] Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.007138 4778 scope.go:117] 
"RemoveContainer" containerID="ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.011280 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dkw"] Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.011612 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.034050 4778 scope.go:117] "RemoveContainer" containerID="d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.067322 4778 scope.go:117] "RemoveContainer" containerID="a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1" Mar 18 09:38:03 crc kubenswrapper[4778]: E0318 09:38:03.067716 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1\": container with ID starting with a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1 not found: ID does not exist" containerID="a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.067828 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1"} err="failed to get container status \"a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1\": rpc error: code = NotFound desc = could not find container \"a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1\": container with ID starting with a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1 not found: ID does not exist" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.067909 4778 scope.go:117] "RemoveContainer" 
containerID="ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45" Mar 18 09:38:03 crc kubenswrapper[4778]: E0318 09:38:03.068222 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45\": container with ID starting with ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45 not found: ID does not exist" containerID="ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.068412 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45"} err="failed to get container status \"ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45\": rpc error: code = NotFound desc = could not find container \"ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45\": container with ID starting with ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45 not found: ID does not exist" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.068489 4778 scope.go:117] "RemoveContainer" containerID="d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee" Mar 18 09:38:03 crc kubenswrapper[4778]: E0318 09:38:03.068714 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee\": container with ID starting with d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee not found: ID does not exist" containerID="d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.068809 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee"} err="failed to get container status \"d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee\": rpc error: code = NotFound desc = could not find container \"d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee\": container with ID starting with d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee not found: ID does not exist" Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.219669 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" path="/var/lib/kubelet/pods/eea5e508-8702-4085-b99d-43524ffd7dac/volumes" Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.344749 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.466862 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49pq9\" (UniqueName: \"kubernetes.io/projected/0548485b-4f03-47ba-8a13-4e3522451291-kube-api-access-49pq9\") pod \"0548485b-4f03-47ba-8a13-4e3522451291\" (UID: \"0548485b-4f03-47ba-8a13-4e3522451291\") " Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.474730 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0548485b-4f03-47ba-8a13-4e3522451291-kube-api-access-49pq9" (OuterVolumeSpecName: "kube-api-access-49pq9") pod "0548485b-4f03-47ba-8a13-4e3522451291" (UID: "0548485b-4f03-47ba-8a13-4e3522451291"). InnerVolumeSpecName "kube-api-access-49pq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.569325 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49pq9\" (UniqueName: \"kubernetes.io/projected/0548485b-4f03-47ba-8a13-4e3522451291-kube-api-access-49pq9\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.941472 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6v22l"] Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.980847 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.980851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" event={"ID":"0548485b-4f03-47ba-8a13-4e3522451291","Type":"ContainerDied","Data":"8465549329bfd92b52a0d5ceef063d0f55bdc478460f4cbdf9d413424cbc4cc5"} Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.980915 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8465549329bfd92b52a0d5ceef063d0f55bdc478460f4cbdf9d413424cbc4cc5" Mar 18 09:38:05 crc kubenswrapper[4778]: I0318 09:38:05.415547 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563772-tcj6t"] Mar 18 09:38:05 crc kubenswrapper[4778]: I0318 09:38:05.423431 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563772-tcj6t"] Mar 18 09:38:05 crc kubenswrapper[4778]: I0318 09:38:05.991838 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6v22l" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="registry-server" containerID="cri-o://68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940" gracePeriod=2 Mar 18 09:38:06 crc 
kubenswrapper[4778]: I0318 09:38:06.198003 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15232b66-3433-4405-9feb-79055e892b3d" path="/var/lib/kubelet/pods/15232b66-3433-4405-9feb-79055e892b3d/volumes" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.435895 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.516047 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-utilities\") pod \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.516138 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-catalog-content\") pod \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.516261 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmv8p\" (UniqueName: \"kubernetes.io/projected/4d79e7e2-9db4-4307-8411-18b74d60b1b7-kube-api-access-zmv8p\") pod \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.525835 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d79e7e2-9db4-4307-8411-18b74d60b1b7-kube-api-access-zmv8p" (OuterVolumeSpecName: "kube-api-access-zmv8p") pod "4d79e7e2-9db4-4307-8411-18b74d60b1b7" (UID: "4d79e7e2-9db4-4307-8411-18b74d60b1b7"). InnerVolumeSpecName "kube-api-access-zmv8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.530712 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-utilities" (OuterVolumeSpecName: "utilities") pod "4d79e7e2-9db4-4307-8411-18b74d60b1b7" (UID: "4d79e7e2-9db4-4307-8411-18b74d60b1b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.602926 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d79e7e2-9db4-4307-8411-18b74d60b1b7" (UID: "4d79e7e2-9db4-4307-8411-18b74d60b1b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.619084 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.619120 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.619134 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmv8p\" (UniqueName: \"kubernetes.io/projected/4d79e7e2-9db4-4307-8411-18b74d60b1b7-kube-api-access-zmv8p\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.005972 4778 generic.go:334] "Generic (PLEG): container finished" podID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerID="68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940" exitCode=0 Mar 
18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.006010 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerDied","Data":"68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940"} Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.006054 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.006112 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerDied","Data":"19c45c0d8cb539fb9cbc49d484d0d217fa556329caaf0c5f55191f47880349c3"} Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.006183 4778 scope.go:117] "RemoveContainer" containerID="68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.053894 4778 scope.go:117] "RemoveContainer" containerID="5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.058706 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6v22l"] Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.066825 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6v22l"] Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.082046 4778 scope.go:117] "RemoveContainer" containerID="4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.128976 4778 scope.go:117] "RemoveContainer" containerID="68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940" Mar 18 09:38:07 crc kubenswrapper[4778]: E0318 09:38:07.129416 4778 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940\": container with ID starting with 68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940 not found: ID does not exist" containerID="68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.129449 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940"} err="failed to get container status \"68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940\": rpc error: code = NotFound desc = could not find container \"68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940\": container with ID starting with 68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940 not found: ID does not exist" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.129472 4778 scope.go:117] "RemoveContainer" containerID="5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708" Mar 18 09:38:07 crc kubenswrapper[4778]: E0318 09:38:07.129798 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708\": container with ID starting with 5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708 not found: ID does not exist" containerID="5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.129825 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708"} err="failed to get container status \"5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708\": rpc error: code = NotFound desc = could not find container 
\"5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708\": container with ID starting with 5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708 not found: ID does not exist" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.129844 4778 scope.go:117] "RemoveContainer" containerID="4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d" Mar 18 09:38:07 crc kubenswrapper[4778]: E0318 09:38:07.130373 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d\": container with ID starting with 4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d not found: ID does not exist" containerID="4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.130401 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d"} err="failed to get container status \"4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d\": rpc error: code = NotFound desc = could not find container \"4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d\": container with ID starting with 4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d not found: ID does not exist" Mar 18 09:38:08 crc kubenswrapper[4778]: I0318 09:38:08.206031 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" path="/var/lib/kubelet/pods/4d79e7e2-9db4-4307-8411-18b74d60b1b7/volumes" Mar 18 09:38:19 crc kubenswrapper[4778]: E0318 09:38:19.190243 4778 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.70:51892->38.102.83.70:35463: write tcp 38.102.83.70:51892->38.102.83.70:35463: write: broken pipe Mar 18 09:38:31 crc kubenswrapper[4778]: I0318 
09:38:31.337978 4778 scope.go:117] "RemoveContainer" containerID="c8ccd760df68dbd5ce4bef875e9b41962b50e1c9d6413d0f1f66a324748d7c49" Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.065830 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.076572 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.084608 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.090940 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.097625 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.118647 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.129216 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.137705 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.145624 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.158230 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zd2d7"] 
Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.170616 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.180292 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.188220 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.195547 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.201151 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.209123 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.216090 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.221544 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.227587 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.233096 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zd2d7"] Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.206891 4778 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="0b7ef620-077d-4a38-90ed-fed05ccba5d2" path="/var/lib/kubelet/pods/0b7ef620-077d-4a38-90ed-fed05ccba5d2/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.208084 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="154a89df-1c2e-4f86-bbf3-827d6443c04a" path="/var/lib/kubelet/pods/154a89df-1c2e-4f86-bbf3-827d6443c04a/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.209434 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe04bef-41cb-47c4-8031-141f8809e8cb" path="/var/lib/kubelet/pods/2fe04bef-41cb-47c4-8031-141f8809e8cb/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.210669 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" path="/var/lib/kubelet/pods/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.212983 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e3f07f1-8381-48a9-8ebb-9cd3a821783f" path="/var/lib/kubelet/pods/8e3f07f1-8381-48a9-8ebb-9cd3a821783f/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.215065 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" path="/var/lib/kubelet/pods/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.215781 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b33de03b-23ec-40c0-b309-0dd2024caf71" path="/var/lib/kubelet/pods/b33de03b-23ec-40c0-b309-0dd2024caf71/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.216487 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b989f767-d1ba-49fe-aebb-6aef120e0e22" path="/var/lib/kubelet/pods/b989f767-d1ba-49fe-aebb-6aef120e0e22/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.217800 4778 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="bf3315b0-1ace-422a-8049-3fd13fe46e65" path="/var/lib/kubelet/pods/bf3315b0-1ace-422a-8049-3fd13fe46e65/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.218563 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d85dc0a9-a7b0-4715-bc9d-974ac7657337" path="/var/lib/kubelet/pods/d85dc0a9-a7b0-4715-bc9d-974ac7657337/volumes" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.695349 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx"] Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.696848 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="extract-content" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.696874 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="extract-content" Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.696906 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="extract-utilities" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.696918 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="extract-utilities" Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.696940 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="registry-server" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.696952 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="registry-server" Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.696975 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="extract-content" Mar 18 09:38:44 crc 
kubenswrapper[4778]: I0318 09:38:44.696986 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="extract-content" Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.697004 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="registry-server" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.697014 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="registry-server" Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.697032 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0548485b-4f03-47ba-8a13-4e3522451291" containerName="oc" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.697043 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0548485b-4f03-47ba-8a13-4e3522451291" containerName="oc" Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.697063 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="extract-utilities" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.697073 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="extract-utilities" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.697452 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="registry-server" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.697480 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0548485b-4f03-47ba-8a13-4e3522451291" containerName="oc" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.697511 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="registry-server" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.698534 
4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.710246 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.710365 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.710365 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.710648 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.710688 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx"] Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.710834 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.840689 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.840795 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.840925 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjgc8\" (UniqueName: \"kubernetes.io/projected/136dbfab-32f1-40ee-b685-74411fbc06ba-kube-api-access-zjgc8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.841027 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.841101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.943427 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 
crc kubenswrapper[4778]: I0318 09:38:44.943546 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.943723 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjgc8\" (UniqueName: \"kubernetes.io/projected/136dbfab-32f1-40ee-b685-74411fbc06ba-kube-api-access-zjgc8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.944445 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.944717 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.957896 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ceph\") 
pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.958060 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.958442 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.959984 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.969244 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjgc8\" (UniqueName: \"kubernetes.io/projected/136dbfab-32f1-40ee-b685-74411fbc06ba-kube-api-access-zjgc8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:45 crc kubenswrapper[4778]: I0318 
09:38:45.036052 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:45 crc kubenswrapper[4778]: I0318 09:38:45.649426 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx"] Mar 18 09:38:46 crc kubenswrapper[4778]: I0318 09:38:46.415076 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" event={"ID":"136dbfab-32f1-40ee-b685-74411fbc06ba","Type":"ContainerStarted","Data":"41c9d610db46c8d317c992030f024e19d0b3f9df9d698ee26c015c45a0a0b2aa"} Mar 18 09:38:46 crc kubenswrapper[4778]: I0318 09:38:46.415470 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" event={"ID":"136dbfab-32f1-40ee-b685-74411fbc06ba","Type":"ContainerStarted","Data":"e7e8db0b2fd629dec04b95406879d289021ad93e6d064630d6901e2c718d4842"} Mar 18 09:38:46 crc kubenswrapper[4778]: I0318 09:38:46.437930 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" podStartSLOduration=1.996853192 podStartE2EDuration="2.437908633s" podCreationTimestamp="2026-03-18 09:38:44 +0000 UTC" firstStartedPulling="2026-03-18 09:38:45.658507737 +0000 UTC m=+2192.233252577" lastFinishedPulling="2026-03-18 09:38:46.099563178 +0000 UTC m=+2192.674308018" observedRunningTime="2026-03-18 09:38:46.432863586 +0000 UTC m=+2193.007608436" watchObservedRunningTime="2026-03-18 09:38:46.437908633 +0000 UTC m=+2193.012653473" Mar 18 09:38:57 crc kubenswrapper[4778]: I0318 09:38:57.519957 4778 generic.go:334] "Generic (PLEG): container finished" podID="136dbfab-32f1-40ee-b685-74411fbc06ba" containerID="41c9d610db46c8d317c992030f024e19d0b3f9df9d698ee26c015c45a0a0b2aa" exitCode=0 Mar 18 09:38:57 crc kubenswrapper[4778]: I0318 09:38:57.520058 
4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" event={"ID":"136dbfab-32f1-40ee-b685-74411fbc06ba","Type":"ContainerDied","Data":"41c9d610db46c8d317c992030f024e19d0b3f9df9d698ee26c015c45a0a0b2aa"} Mar 18 09:38:58 crc kubenswrapper[4778]: I0318 09:38:58.951217 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.034095 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ceph\") pod \"136dbfab-32f1-40ee-b685-74411fbc06ba\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.034429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-inventory\") pod \"136dbfab-32f1-40ee-b685-74411fbc06ba\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.034640 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-repo-setup-combined-ca-bundle\") pod \"136dbfab-32f1-40ee-b685-74411fbc06ba\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.034879 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjgc8\" (UniqueName: \"kubernetes.io/projected/136dbfab-32f1-40ee-b685-74411fbc06ba-kube-api-access-zjgc8\") pod \"136dbfab-32f1-40ee-b685-74411fbc06ba\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.034957 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ssh-key-openstack-edpm-ipam\") pod \"136dbfab-32f1-40ee-b685-74411fbc06ba\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.041173 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/136dbfab-32f1-40ee-b685-74411fbc06ba-kube-api-access-zjgc8" (OuterVolumeSpecName: "kube-api-access-zjgc8") pod "136dbfab-32f1-40ee-b685-74411fbc06ba" (UID: "136dbfab-32f1-40ee-b685-74411fbc06ba"). InnerVolumeSpecName "kube-api-access-zjgc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.041711 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ceph" (OuterVolumeSpecName: "ceph") pod "136dbfab-32f1-40ee-b685-74411fbc06ba" (UID: "136dbfab-32f1-40ee-b685-74411fbc06ba"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.048427 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "136dbfab-32f1-40ee-b685-74411fbc06ba" (UID: "136dbfab-32f1-40ee-b685-74411fbc06ba"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.071543 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "136dbfab-32f1-40ee-b685-74411fbc06ba" (UID: "136dbfab-32f1-40ee-b685-74411fbc06ba"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.076721 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-inventory" (OuterVolumeSpecName: "inventory") pod "136dbfab-32f1-40ee-b685-74411fbc06ba" (UID: "136dbfab-32f1-40ee-b685-74411fbc06ba"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.138487 4778 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.138534 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjgc8\" (UniqueName: \"kubernetes.io/projected/136dbfab-32f1-40ee-b685-74411fbc06ba-kube-api-access-zjgc8\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.138548 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.138563 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.138576 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.546802 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" event={"ID":"136dbfab-32f1-40ee-b685-74411fbc06ba","Type":"ContainerDied","Data":"e7e8db0b2fd629dec04b95406879d289021ad93e6d064630d6901e2c718d4842"} Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.546868 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7e8db0b2fd629dec04b95406879d289021ad93e6d064630d6901e2c718d4842" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.546960 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.663984 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk"] Mar 18 09:38:59 crc kubenswrapper[4778]: E0318 09:38:59.664865 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136dbfab-32f1-40ee-b685-74411fbc06ba" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.664892 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="136dbfab-32f1-40ee-b685-74411fbc06ba" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.665149 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="136dbfab-32f1-40ee-b685-74411fbc06ba" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.665905 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.669400 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.677426 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.677600 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.680593 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.680851 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.687418 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk"] Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.754108 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.754486 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgsrs\" (UniqueName: \"kubernetes.io/projected/f4bddd5e-314b-49c0-963c-107e6798c40e-kube-api-access-sgsrs\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: 
\"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.754586 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.754869 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.755064 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.857445 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgsrs\" (UniqueName: \"kubernetes.io/projected/f4bddd5e-314b-49c0-963c-107e6798c40e-kube-api-access-sgsrs\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.857516 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.857575 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.857634 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.857688 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.863471 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ssh-key-openstack-edpm-ipam\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.863601 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.868085 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.870756 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.881041 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgsrs\" (UniqueName: \"kubernetes.io/projected/f4bddd5e-314b-49c0-963c-107e6798c40e-kube-api-access-sgsrs\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.992231 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:39:00 crc kubenswrapper[4778]: I0318 09:39:00.619659 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk"] Mar 18 09:39:00 crc kubenswrapper[4778]: W0318 09:39:00.621646 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4bddd5e_314b_49c0_963c_107e6798c40e.slice/crio-caac60756c71869f718f55d75bf41c099232c88f6ff497cf326f1789ff881da3 WatchSource:0}: Error finding container caac60756c71869f718f55d75bf41c099232c88f6ff497cf326f1789ff881da3: Status 404 returned error can't find the container with id caac60756c71869f718f55d75bf41c099232c88f6ff497cf326f1789ff881da3 Mar 18 09:39:01 crc kubenswrapper[4778]: I0318 09:39:01.579163 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" event={"ID":"f4bddd5e-314b-49c0-963c-107e6798c40e","Type":"ContainerStarted","Data":"f2ea8edc9c4961bcef17cef1e281edb3e1211a2e5c2d2551a850dcaae9c256c3"} Mar 18 09:39:01 crc kubenswrapper[4778]: I0318 09:39:01.579603 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" event={"ID":"f4bddd5e-314b-49c0-963c-107e6798c40e","Type":"ContainerStarted","Data":"caac60756c71869f718f55d75bf41c099232c88f6ff497cf326f1789ff881da3"} Mar 18 09:39:01 crc kubenswrapper[4778]: I0318 09:39:01.608563 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" podStartSLOduration=2.117923818 podStartE2EDuration="2.608531832s" podCreationTimestamp="2026-03-18 09:38:59 +0000 UTC" firstStartedPulling="2026-03-18 09:39:00.624129857 +0000 UTC m=+2207.198874697" lastFinishedPulling="2026-03-18 09:39:01.114737881 +0000 UTC m=+2207.689482711" 
observedRunningTime="2026-03-18 09:39:01.605921611 +0000 UTC m=+2208.180666531" watchObservedRunningTime="2026-03-18 09:39:01.608531832 +0000 UTC m=+2208.183276712" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.431640 4778 scope.go:117] "RemoveContainer" containerID="fff08d52bb8eb70989a438cee10bfa00abf8f2f0d3255a5a38ba7fa52ac6fd7d" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.466455 4778 scope.go:117] "RemoveContainer" containerID="a0e4df9a818fec5131c555c760ba72656483292d6411393207bbb36928547cc0" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.527261 4778 scope.go:117] "RemoveContainer" containerID="e73cd1a7f2cc415ed8844ab76dbcafbf6d6af8f6453df69ed69f66ec8f314dde" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.596073 4778 scope.go:117] "RemoveContainer" containerID="df748c22cbd9cfd719213cf439a446ed8f2c405ec832bdc5f38d5aacebbce9a9" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.672560 4778 scope.go:117] "RemoveContainer" containerID="fd288c3256024cadd2bb212c37d37772aadd4cda1a6ce7e57e524f08cb5c87a0" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.715384 4778 scope.go:117] "RemoveContainer" containerID="624102bc1fa0c3850d7e6900b631b277f6bac2b359e2ccf5e40af6d9d87d6742" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.751565 4778 scope.go:117] "RemoveContainer" containerID="9d9ac3d0a2c7513d5a45e8e6d19a1411862166adf069327b3e174ecc2c3c3a28" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.418380 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-khwnk"] Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.421054 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.441836 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khwnk"] Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.533370 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-catalog-content\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.533474 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-utilities\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.533520 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhchc\" (UniqueName: \"kubernetes.io/projected/eee28de9-04d9-4210-87f7-b51710f5befc-kube-api-access-jhchc\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.634583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-utilities\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.634643 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jhchc\" (UniqueName: \"kubernetes.io/projected/eee28de9-04d9-4210-87f7-b51710f5befc-kube-api-access-jhchc\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.634739 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-catalog-content\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.635247 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-catalog-content\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.635456 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-utilities\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.662413 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhchc\" (UniqueName: \"kubernetes.io/projected/eee28de9-04d9-4210-87f7-b51710f5befc-kube-api-access-jhchc\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.778809 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:38 crc kubenswrapper[4778]: I0318 09:39:38.230430 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khwnk"] Mar 18 09:39:38 crc kubenswrapper[4778]: I0318 09:39:38.914082 4778 generic.go:334] "Generic (PLEG): container finished" podID="eee28de9-04d9-4210-87f7-b51710f5befc" containerID="12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d" exitCode=0 Mar 18 09:39:38 crc kubenswrapper[4778]: I0318 09:39:38.914189 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerDied","Data":"12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d"} Mar 18 09:39:38 crc kubenswrapper[4778]: I0318 09:39:38.914348 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerStarted","Data":"52dcfc339804749e6cfb13e84a2f1b2c8e710e66910f70bd95fb38bf89d675f7"} Mar 18 09:39:39 crc kubenswrapper[4778]: I0318 09:39:39.923062 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerStarted","Data":"8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15"} Mar 18 09:39:41 crc kubenswrapper[4778]: I0318 09:39:41.940835 4778 generic.go:334] "Generic (PLEG): container finished" podID="eee28de9-04d9-4210-87f7-b51710f5befc" containerID="8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15" exitCode=0 Mar 18 09:39:41 crc kubenswrapper[4778]: I0318 09:39:41.940903 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" 
event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerDied","Data":"8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15"} Mar 18 09:39:42 crc kubenswrapper[4778]: I0318 09:39:42.951943 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerStarted","Data":"14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd"} Mar 18 09:39:42 crc kubenswrapper[4778]: I0318 09:39:42.971540 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-khwnk" podStartSLOduration=2.523971043 podStartE2EDuration="5.971518961s" podCreationTimestamp="2026-03-18 09:39:37 +0000 UTC" firstStartedPulling="2026-03-18 09:39:38.915472201 +0000 UTC m=+2245.490217041" lastFinishedPulling="2026-03-18 09:39:42.363020119 +0000 UTC m=+2248.937764959" observedRunningTime="2026-03-18 09:39:42.970090572 +0000 UTC m=+2249.544835412" watchObservedRunningTime="2026-03-18 09:39:42.971518961 +0000 UTC m=+2249.546263811" Mar 18 09:39:47 crc kubenswrapper[4778]: I0318 09:39:47.779679 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:47 crc kubenswrapper[4778]: I0318 09:39:47.780269 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:48 crc kubenswrapper[4778]: I0318 09:39:48.829043 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-khwnk" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="registry-server" probeResult="failure" output=< Mar 18 09:39:48 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:39:48 crc kubenswrapper[4778]: > Mar 18 09:39:57 crc kubenswrapper[4778]: I0318 09:39:57.825694 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:57 crc kubenswrapper[4778]: I0318 09:39:57.878856 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:58 crc kubenswrapper[4778]: I0318 09:39:58.065975 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-khwnk"] Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.090455 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-khwnk" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="registry-server" containerID="cri-o://14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd" gracePeriod=2 Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.628250 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.702382 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-catalog-content\") pod \"eee28de9-04d9-4210-87f7-b51710f5befc\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.702443 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-utilities\") pod \"eee28de9-04d9-4210-87f7-b51710f5befc\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.702489 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhchc\" (UniqueName: \"kubernetes.io/projected/eee28de9-04d9-4210-87f7-b51710f5befc-kube-api-access-jhchc\") pod 
\"eee28de9-04d9-4210-87f7-b51710f5befc\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.704155 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-utilities" (OuterVolumeSpecName: "utilities") pod "eee28de9-04d9-4210-87f7-b51710f5befc" (UID: "eee28de9-04d9-4210-87f7-b51710f5befc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.711103 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee28de9-04d9-4210-87f7-b51710f5befc-kube-api-access-jhchc" (OuterVolumeSpecName: "kube-api-access-jhchc") pod "eee28de9-04d9-4210-87f7-b51710f5befc" (UID: "eee28de9-04d9-4210-87f7-b51710f5befc"). InnerVolumeSpecName "kube-api-access-jhchc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.805969 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.806507 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhchc\" (UniqueName: \"kubernetes.io/projected/eee28de9-04d9-4210-87f7-b51710f5befc-kube-api-access-jhchc\") on node \"crc\" DevicePath \"\"" Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.826140 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eee28de9-04d9-4210-87f7-b51710f5befc" (UID: "eee28de9-04d9-4210-87f7-b51710f5befc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.908515 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.099396 4778 generic.go:334] "Generic (PLEG): container finished" podID="eee28de9-04d9-4210-87f7-b51710f5befc" containerID="14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd" exitCode=0 Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.099500 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.100454 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerDied","Data":"14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd"} Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.100504 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerDied","Data":"52dcfc339804749e6cfb13e84a2f1b2c8e710e66910f70bd95fb38bf89d675f7"} Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.100548 4778 scope.go:117] "RemoveContainer" containerID="14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.142982 4778 scope.go:117] "RemoveContainer" containerID="8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.155549 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-khwnk"] Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 
09:40:00.164110 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563780-vgggq"] Mar 18 09:40:00 crc kubenswrapper[4778]: E0318 09:40:00.164559 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="registry-server" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.164578 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="registry-server" Mar 18 09:40:00 crc kubenswrapper[4778]: E0318 09:40:00.164596 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="extract-content" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.164602 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="extract-content" Mar 18 09:40:00 crc kubenswrapper[4778]: E0318 09:40:00.164627 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="extract-utilities" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.164633 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="extract-utilities" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.164800 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="registry-server" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.165478 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563780-vgggq" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.168007 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.168367 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.169506 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.172805 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-khwnk"] Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.173034 4778 scope.go:117] "RemoveContainer" containerID="12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.183056 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563780-vgggq"] Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.207074 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" path="/var/lib/kubelet/pods/eee28de9-04d9-4210-87f7-b51710f5befc/volumes" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.217507 4778 scope.go:117] "RemoveContainer" containerID="14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd" Mar 18 09:40:00 crc kubenswrapper[4778]: E0318 09:40:00.218478 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd\": container with ID starting with 14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd not found: ID does not exist" 
containerID="14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.218518 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd"} err="failed to get container status \"14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd\": rpc error: code = NotFound desc = could not find container \"14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd\": container with ID starting with 14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd not found: ID does not exist" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.218547 4778 scope.go:117] "RemoveContainer" containerID="8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15" Mar 18 09:40:00 crc kubenswrapper[4778]: E0318 09:40:00.218955 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15\": container with ID starting with 8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15 not found: ID does not exist" containerID="8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.218985 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15"} err="failed to get container status \"8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15\": rpc error: code = NotFound desc = could not find container \"8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15\": container with ID starting with 8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15 not found: ID does not exist" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.219001 4778 scope.go:117] 
"RemoveContainer" containerID="12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d" Mar 18 09:40:00 crc kubenswrapper[4778]: E0318 09:40:00.219233 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d\": container with ID starting with 12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d not found: ID does not exist" containerID="12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.219266 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d"} err="failed to get container status \"12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d\": rpc error: code = NotFound desc = could not find container \"12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d\": container with ID starting with 12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d not found: ID does not exist" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.317513 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgrn9\" (UniqueName: \"kubernetes.io/projected/ed393452-0d17-4c60-b37b-544b21c09da1-kube-api-access-sgrn9\") pod \"auto-csr-approver-29563780-vgggq\" (UID: \"ed393452-0d17-4c60-b37b-544b21c09da1\") " pod="openshift-infra/auto-csr-approver-29563780-vgggq" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.419439 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgrn9\" (UniqueName: \"kubernetes.io/projected/ed393452-0d17-4c60-b37b-544b21c09da1-kube-api-access-sgrn9\") pod \"auto-csr-approver-29563780-vgggq\" (UID: \"ed393452-0d17-4c60-b37b-544b21c09da1\") " 
pod="openshift-infra/auto-csr-approver-29563780-vgggq" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.444679 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgrn9\" (UniqueName: \"kubernetes.io/projected/ed393452-0d17-4c60-b37b-544b21c09da1-kube-api-access-sgrn9\") pod \"auto-csr-approver-29563780-vgggq\" (UID: \"ed393452-0d17-4c60-b37b-544b21c09da1\") " pod="openshift-infra/auto-csr-approver-29563780-vgggq" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.571155 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563780-vgggq" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.999400 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563780-vgggq"] Mar 18 09:40:01 crc kubenswrapper[4778]: I0318 09:40:01.107921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563780-vgggq" event={"ID":"ed393452-0d17-4c60-b37b-544b21c09da1","Type":"ContainerStarted","Data":"114b3c91b330bc93cc75643b229353a42c117b6ac1efb3eeced8a0d86a2ebe7c"} Mar 18 09:40:03 crc kubenswrapper[4778]: I0318 09:40:03.134739 4778 generic.go:334] "Generic (PLEG): container finished" podID="ed393452-0d17-4c60-b37b-544b21c09da1" containerID="9f45f4032f3621f6cd43ea95d13369122ace0eb37b6189c6643a14332da3a74a" exitCode=0 Mar 18 09:40:03 crc kubenswrapper[4778]: I0318 09:40:03.134781 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563780-vgggq" event={"ID":"ed393452-0d17-4c60-b37b-544b21c09da1","Type":"ContainerDied","Data":"9f45f4032f3621f6cd43ea95d13369122ace0eb37b6189c6643a14332da3a74a"} Mar 18 09:40:04 crc kubenswrapper[4778]: I0318 09:40:04.500426 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563780-vgggq" Mar 18 09:40:04 crc kubenswrapper[4778]: I0318 09:40:04.599427 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgrn9\" (UniqueName: \"kubernetes.io/projected/ed393452-0d17-4c60-b37b-544b21c09da1-kube-api-access-sgrn9\") pod \"ed393452-0d17-4c60-b37b-544b21c09da1\" (UID: \"ed393452-0d17-4c60-b37b-544b21c09da1\") " Mar 18 09:40:04 crc kubenswrapper[4778]: I0318 09:40:04.606018 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed393452-0d17-4c60-b37b-544b21c09da1-kube-api-access-sgrn9" (OuterVolumeSpecName: "kube-api-access-sgrn9") pod "ed393452-0d17-4c60-b37b-544b21c09da1" (UID: "ed393452-0d17-4c60-b37b-544b21c09da1"). InnerVolumeSpecName "kube-api-access-sgrn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:40:04 crc kubenswrapper[4778]: I0318 09:40:04.702022 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgrn9\" (UniqueName: \"kubernetes.io/projected/ed393452-0d17-4c60-b37b-544b21c09da1-kube-api-access-sgrn9\") on node \"crc\" DevicePath \"\"" Mar 18 09:40:05 crc kubenswrapper[4778]: I0318 09:40:05.155882 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563780-vgggq" event={"ID":"ed393452-0d17-4c60-b37b-544b21c09da1","Type":"ContainerDied","Data":"114b3c91b330bc93cc75643b229353a42c117b6ac1efb3eeced8a0d86a2ebe7c"} Mar 18 09:40:05 crc kubenswrapper[4778]: I0318 09:40:05.155924 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563780-vgggq" Mar 18 09:40:05 crc kubenswrapper[4778]: I0318 09:40:05.155928 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="114b3c91b330bc93cc75643b229353a42c117b6ac1efb3eeced8a0d86a2ebe7c" Mar 18 09:40:05 crc kubenswrapper[4778]: I0318 09:40:05.574164 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563774-jn4n5"] Mar 18 09:40:05 crc kubenswrapper[4778]: I0318 09:40:05.583745 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563774-jn4n5"] Mar 18 09:40:06 crc kubenswrapper[4778]: I0318 09:40:06.198993 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315606e3-7197-4234-b672-400a86339d27" path="/var/lib/kubelet/pods/315606e3-7197-4234-b672-400a86339d27/volumes" Mar 18 09:40:30 crc kubenswrapper[4778]: I0318 09:40:30.147783 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:40:30 crc kubenswrapper[4778]: I0318 09:40:30.148258 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:40:31 crc kubenswrapper[4778]: I0318 09:40:31.895085 4778 scope.go:117] "RemoveContainer" containerID="3e7ed49b01f49625749fbe5496f4ace13851a2f40c5fbcd9633d28b842edcbb0" Mar 18 09:40:31 crc kubenswrapper[4778]: I0318 09:40:31.964569 4778 scope.go:117] "RemoveContainer" 
containerID="16287400dde13e1c43e43b5d82fb77152a697e54c70364b3f589e427eeb0ea66" Mar 18 09:40:32 crc kubenswrapper[4778]: I0318 09:40:32.014705 4778 scope.go:117] "RemoveContainer" containerID="40ec8676e1c1ccc56a543cb5a42de5f34fb52e822133f45c8ebee2d755dab39f" Mar 18 09:40:39 crc kubenswrapper[4778]: I0318 09:40:39.467281 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4bddd5e-314b-49c0-963c-107e6798c40e" containerID="f2ea8edc9c4961bcef17cef1e281edb3e1211a2e5c2d2551a850dcaae9c256c3" exitCode=0 Mar 18 09:40:39 crc kubenswrapper[4778]: I0318 09:40:39.467351 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" event={"ID":"f4bddd5e-314b-49c0-963c-107e6798c40e","Type":"ContainerDied","Data":"f2ea8edc9c4961bcef17cef1e281edb3e1211a2e5c2d2551a850dcaae9c256c3"} Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.946318 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.970287 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-bootstrap-combined-ca-bundle\") pod \"f4bddd5e-314b-49c0-963c-107e6798c40e\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.970381 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-inventory\") pod \"f4bddd5e-314b-49c0-963c-107e6798c40e\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.973254 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgsrs\" (UniqueName: 
\"kubernetes.io/projected/f4bddd5e-314b-49c0-963c-107e6798c40e-kube-api-access-sgsrs\") pod \"f4bddd5e-314b-49c0-963c-107e6798c40e\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.973347 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ssh-key-openstack-edpm-ipam\") pod \"f4bddd5e-314b-49c0-963c-107e6798c40e\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.973373 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ceph\") pod \"f4bddd5e-314b-49c0-963c-107e6798c40e\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.985572 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ceph" (OuterVolumeSpecName: "ceph") pod "f4bddd5e-314b-49c0-963c-107e6798c40e" (UID: "f4bddd5e-314b-49c0-963c-107e6798c40e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.985593 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f4bddd5e-314b-49c0-963c-107e6798c40e" (UID: "f4bddd5e-314b-49c0-963c-107e6798c40e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.985598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4bddd5e-314b-49c0-963c-107e6798c40e-kube-api-access-sgsrs" (OuterVolumeSpecName: "kube-api-access-sgsrs") pod "f4bddd5e-314b-49c0-963c-107e6798c40e" (UID: "f4bddd5e-314b-49c0-963c-107e6798c40e"). InnerVolumeSpecName "kube-api-access-sgsrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.005179 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-inventory" (OuterVolumeSpecName: "inventory") pod "f4bddd5e-314b-49c0-963c-107e6798c40e" (UID: "f4bddd5e-314b-49c0-963c-107e6798c40e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.005532 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f4bddd5e-314b-49c0-963c-107e6798c40e" (UID: "f4bddd5e-314b-49c0-963c-107e6798c40e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.075729 4778 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.075769 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.075780 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgsrs\" (UniqueName: \"kubernetes.io/projected/f4bddd5e-314b-49c0-963c-107e6798c40e-kube-api-access-sgsrs\") on node \"crc\" DevicePath \"\"" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.075788 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.075796 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.492944 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" event={"ID":"f4bddd5e-314b-49c0-963c-107e6798c40e","Type":"ContainerDied","Data":"caac60756c71869f718f55d75bf41c099232c88f6ff497cf326f1789ff881da3"} Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.492985 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.492991 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caac60756c71869f718f55d75bf41c099232c88f6ff497cf326f1789ff881da3" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.599888 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"] Mar 18 09:40:41 crc kubenswrapper[4778]: E0318 09:40:41.600338 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4bddd5e-314b-49c0-963c-107e6798c40e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.600361 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4bddd5e-314b-49c0-963c-107e6798c40e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 09:40:41 crc kubenswrapper[4778]: E0318 09:40:41.600378 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed393452-0d17-4c60-b37b-544b21c09da1" containerName="oc" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.600387 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed393452-0d17-4c60-b37b-544b21c09da1" containerName="oc" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.600780 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4bddd5e-314b-49c0-963c-107e6798c40e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.600819 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed393452-0d17-4c60-b37b-544b21c09da1" containerName="oc" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.601585 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.606526 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.606526 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.606614 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.606702 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.606771 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.624887 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"] Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.684297 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhwpk\" (UniqueName: \"kubernetes.io/projected/d44d6afe-0030-4d9d-9fa7-f75274eff578-kube-api-access-rhwpk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.684432 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ssh-key-openstack-edpm-ipam\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.684595 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.684744 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.786872 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhwpk\" (UniqueName: \"kubernetes.io/projected/d44d6afe-0030-4d9d-9fa7-f75274eff578-kube-api-access-rhwpk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.787012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.787128 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.787265 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.792280 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.793092 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.794145 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.806249 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhwpk\" (UniqueName: \"kubernetes.io/projected/d44d6afe-0030-4d9d-9fa7-f75274eff578-kube-api-access-rhwpk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.920262 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:40:42 crc kubenswrapper[4778]: I0318 09:40:42.544311 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"] Mar 18 09:40:43 crc kubenswrapper[4778]: I0318 09:40:43.515027 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" event={"ID":"d44d6afe-0030-4d9d-9fa7-f75274eff578","Type":"ContainerStarted","Data":"101b85884f51ecbe703c99472efa3468c6a22a1f91c36bdb3336321d929be59a"} Mar 18 09:40:43 crc kubenswrapper[4778]: I0318 09:40:43.515439 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" event={"ID":"d44d6afe-0030-4d9d-9fa7-f75274eff578","Type":"ContainerStarted","Data":"28aece455218dcc9f1c2d64fb9c61409c6f2bbf3a12733a124354a4dba544ba1"} Mar 18 09:40:43 crc kubenswrapper[4778]: I0318 09:40:43.540964 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" podStartSLOduration=1.9570557979999998 podStartE2EDuration="2.54089267s" podCreationTimestamp="2026-03-18 09:40:41 +0000 UTC" firstStartedPulling="2026-03-18 09:40:42.558728867 +0000 UTC m=+2309.133473717" lastFinishedPulling="2026-03-18 09:40:43.142565759 +0000 UTC m=+2309.717310589" observedRunningTime="2026-03-18 09:40:43.537529879 +0000 UTC m=+2310.112274759" watchObservedRunningTime="2026-03-18 09:40:43.54089267 +0000 UTC m=+2310.115637550" Mar 18 09:41:00 crc kubenswrapper[4778]: I0318 09:41:00.147246 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:41:00 crc kubenswrapper[4778]: I0318 09:41:00.147768 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:41:08 crc kubenswrapper[4778]: I0318 09:41:08.735460 4778 generic.go:334] "Generic (PLEG): container finished" podID="d44d6afe-0030-4d9d-9fa7-f75274eff578" containerID="101b85884f51ecbe703c99472efa3468c6a22a1f91c36bdb3336321d929be59a" exitCode=0 Mar 18 09:41:08 crc kubenswrapper[4778]: I0318 09:41:08.735545 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" event={"ID":"d44d6afe-0030-4d9d-9fa7-f75274eff578","Type":"ContainerDied","Data":"101b85884f51ecbe703c99472efa3468c6a22a1f91c36bdb3336321d929be59a"} Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.178567 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.231661 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ssh-key-openstack-edpm-ipam\") pod \"d44d6afe-0030-4d9d-9fa7-f75274eff578\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.231814 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhwpk\" (UniqueName: \"kubernetes.io/projected/d44d6afe-0030-4d9d-9fa7-f75274eff578-kube-api-access-rhwpk\") pod \"d44d6afe-0030-4d9d-9fa7-f75274eff578\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.231881 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-inventory\") pod \"d44d6afe-0030-4d9d-9fa7-f75274eff578\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.232003 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ceph\") pod \"d44d6afe-0030-4d9d-9fa7-f75274eff578\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.237891 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ceph" (OuterVolumeSpecName: "ceph") pod "d44d6afe-0030-4d9d-9fa7-f75274eff578" (UID: "d44d6afe-0030-4d9d-9fa7-f75274eff578"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.238585 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d44d6afe-0030-4d9d-9fa7-f75274eff578-kube-api-access-rhwpk" (OuterVolumeSpecName: "kube-api-access-rhwpk") pod "d44d6afe-0030-4d9d-9fa7-f75274eff578" (UID: "d44d6afe-0030-4d9d-9fa7-f75274eff578"). InnerVolumeSpecName "kube-api-access-rhwpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.276331 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d44d6afe-0030-4d9d-9fa7-f75274eff578" (UID: "d44d6afe-0030-4d9d-9fa7-f75274eff578"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.277985 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-inventory" (OuterVolumeSpecName: "inventory") pod "d44d6afe-0030-4d9d-9fa7-f75274eff578" (UID: "d44d6afe-0030-4d9d-9fa7-f75274eff578"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.334989 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.335033 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhwpk\" (UniqueName: \"kubernetes.io/projected/d44d6afe-0030-4d9d-9fa7-f75274eff578-kube-api-access-rhwpk\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.335047 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.335058 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.759757 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" event={"ID":"d44d6afe-0030-4d9d-9fa7-f75274eff578","Type":"ContainerDied","Data":"28aece455218dcc9f1c2d64fb9c61409c6f2bbf3a12733a124354a4dba544ba1"} Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.759799 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28aece455218dcc9f1c2d64fb9c61409c6f2bbf3a12733a124354a4dba544ba1" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.759849 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.856559 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"] Mar 18 09:41:10 crc kubenswrapper[4778]: E0318 09:41:10.857309 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d44d6afe-0030-4d9d-9fa7-f75274eff578" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.857340 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d44d6afe-0030-4d9d-9fa7-f75274eff578" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.857572 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d44d6afe-0030-4d9d-9fa7-f75274eff578" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.858416 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.861157 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.861309 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.861354 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.861440 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.862364 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.868825 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"] Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.947338 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.947415 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms2xz\" (UniqueName: \"kubernetes.io/projected/5e5ecb95-ba90-4f70-ae42-63e71026ffef-kube-api-access-ms2xz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: 
\"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.947460 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.947492 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.049689 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.049769 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms2xz\" (UniqueName: \"kubernetes.io/projected/5e5ecb95-ba90-4f70-ae42-63e71026ffef-kube-api-access-ms2xz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:11 crc kubenswrapper[4778]: 
I0318 09:41:11.049821 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.049855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.055399 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.055403 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.056162 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.066717 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms2xz\" (UniqueName: \"kubernetes.io/projected/5e5ecb95-ba90-4f70-ae42-63e71026ffef-kube-api-access-ms2xz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.210421 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.734591 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"] Mar 18 09:41:11 crc kubenswrapper[4778]: W0318 09:41:11.737529 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e5ecb95_ba90_4f70_ae42_63e71026ffef.slice/crio-5362db78a90a4c66c68f2fb7c300d45fbeafe52062ec9d6f77048c372ab29d53 WatchSource:0}: Error finding container 5362db78a90a4c66c68f2fb7c300d45fbeafe52062ec9d6f77048c372ab29d53: Status 404 returned error can't find the container with id 5362db78a90a4c66c68f2fb7c300d45fbeafe52062ec9d6f77048c372ab29d53 Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.768819 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" event={"ID":"5e5ecb95-ba90-4f70-ae42-63e71026ffef","Type":"ContainerStarted","Data":"5362db78a90a4c66c68f2fb7c300d45fbeafe52062ec9d6f77048c372ab29d53"} Mar 18 09:41:12 crc kubenswrapper[4778]: I0318 09:41:12.778249 
4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" event={"ID":"5e5ecb95-ba90-4f70-ae42-63e71026ffef","Type":"ContainerStarted","Data":"27c9136b748cf6c2f18636f6fd7d7d19fbcf9b96dbd5072132c2ae8ae1540e5b"} Mar 18 09:41:17 crc kubenswrapper[4778]: I0318 09:41:17.823006 4778 generic.go:334] "Generic (PLEG): container finished" podID="5e5ecb95-ba90-4f70-ae42-63e71026ffef" containerID="27c9136b748cf6c2f18636f6fd7d7d19fbcf9b96dbd5072132c2ae8ae1540e5b" exitCode=0 Mar 18 09:41:17 crc kubenswrapper[4778]: I0318 09:41:17.823132 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" event={"ID":"5e5ecb95-ba90-4f70-ae42-63e71026ffef","Type":"ContainerDied","Data":"27c9136b748cf6c2f18636f6fd7d7d19fbcf9b96dbd5072132c2ae8ae1540e5b"} Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.271293 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.334077 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ceph\") pod \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.334140 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ssh-key-openstack-edpm-ipam\") pod \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.334238 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-inventory\") pod \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.334453 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms2xz\" (UniqueName: \"kubernetes.io/projected/5e5ecb95-ba90-4f70-ae42-63e71026ffef-kube-api-access-ms2xz\") pod \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.340786 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e5ecb95-ba90-4f70-ae42-63e71026ffef-kube-api-access-ms2xz" (OuterVolumeSpecName: "kube-api-access-ms2xz") pod "5e5ecb95-ba90-4f70-ae42-63e71026ffef" (UID: "5e5ecb95-ba90-4f70-ae42-63e71026ffef"). InnerVolumeSpecName "kube-api-access-ms2xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.343014 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ceph" (OuterVolumeSpecName: "ceph") pod "5e5ecb95-ba90-4f70-ae42-63e71026ffef" (UID: "5e5ecb95-ba90-4f70-ae42-63e71026ffef"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.364621 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5e5ecb95-ba90-4f70-ae42-63e71026ffef" (UID: "5e5ecb95-ba90-4f70-ae42-63e71026ffef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.379286 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-inventory" (OuterVolumeSpecName: "inventory") pod "5e5ecb95-ba90-4f70-ae42-63e71026ffef" (UID: "5e5ecb95-ba90-4f70-ae42-63e71026ffef"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.437189 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms2xz\" (UniqueName: \"kubernetes.io/projected/5e5ecb95-ba90-4f70-ae42-63e71026ffef-kube-api-access-ms2xz\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.437249 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.437264 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.437276 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.848684 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" event={"ID":"5e5ecb95-ba90-4f70-ae42-63e71026ffef","Type":"ContainerDied","Data":"5362db78a90a4c66c68f2fb7c300d45fbeafe52062ec9d6f77048c372ab29d53"} Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.848767 4778 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="5362db78a90a4c66c68f2fb7c300d45fbeafe52062ec9d6f77048c372ab29d53" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.848707 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.943251 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82"] Mar 18 09:41:19 crc kubenswrapper[4778]: E0318 09:41:19.943584 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5ecb95-ba90-4f70-ae42-63e71026ffef" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.943616 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5ecb95-ba90-4f70-ae42-63e71026ffef" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.943783 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e5ecb95-ba90-4f70-ae42-63e71026ffef" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.944599 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.947754 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.948393 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.948875 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.950974 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.955655 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.962012 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82"] Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.047653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.047735 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.047894 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.047939 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdnqj\" (UniqueName: \"kubernetes.io/projected/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-kube-api-access-jdnqj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.150494 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.151585 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdnqj\" (UniqueName: \"kubernetes.io/projected/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-kube-api-access-jdnqj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.151825 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.151923 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.156169 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.156589 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.158859 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.179014 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdnqj\" (UniqueName: \"kubernetes.io/projected/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-kube-api-access-jdnqj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.265918 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.620966 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82"] Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.860785 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" event={"ID":"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50","Type":"ContainerStarted","Data":"b0e26c8012e479cf5cd6cde6333b5f580c7eeb43d328019a02a961863e5192bc"} Mar 18 09:41:22 crc kubenswrapper[4778]: I0318 09:41:22.880765 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" event={"ID":"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50","Type":"ContainerStarted","Data":"478f434742f905b8f2086a34869fb64730d5134c202716140a2d3e6b0f090ffb"} Mar 18 09:41:22 crc kubenswrapper[4778]: I0318 09:41:22.912438 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" podStartSLOduration=2.398803073 podStartE2EDuration="3.912417428s" podCreationTimestamp="2026-03-18 09:41:19 +0000 UTC" firstStartedPulling="2026-03-18 09:41:20.624517976 +0000 UTC m=+2347.199262816" 
lastFinishedPulling="2026-03-18 09:41:22.138132331 +0000 UTC m=+2348.712877171" observedRunningTime="2026-03-18 09:41:22.902615142 +0000 UTC m=+2349.477360022" watchObservedRunningTime="2026-03-18 09:41:22.912417428 +0000 UTC m=+2349.487162288" Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.147763 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.148618 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.148685 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.149833 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.149940 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" 
containerID="cri-o://0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" gracePeriod=600 Mar 18 09:41:30 crc kubenswrapper[4778]: E0318 09:41:30.275017 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.954990 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" exitCode=0 Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.955038 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9"} Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.955105 4778 scope.go:117] "RemoveContainer" containerID="0190a17d7c9066ee03765200824080dd651faade8ac0a106f1878805db93f225" Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.955967 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:41:30 crc kubenswrapper[4778]: E0318 09:41:30.956346 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:41:32 crc kubenswrapper[4778]: I0318 09:41:32.146318 4778 scope.go:117] "RemoveContainer" containerID="ae502ee49eb38287e9aaaa3ab0077cb1ef93b09b02a84b50a76e8fa209d6ad0c" Mar 18 09:41:41 crc kubenswrapper[4778]: I0318 09:41:41.187081 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:41:41 crc kubenswrapper[4778]: E0318 09:41:41.188121 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:41:54 crc kubenswrapper[4778]: I0318 09:41:54.191723 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:41:54 crc kubenswrapper[4778]: E0318 09:41:54.192649 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:41:57 crc kubenswrapper[4778]: I0318 09:41:57.233057 4778 generic.go:334] "Generic (PLEG): container finished" podID="5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" containerID="478f434742f905b8f2086a34869fb64730d5134c202716140a2d3e6b0f090ffb" exitCode=0 Mar 18 09:41:57 crc kubenswrapper[4778]: I0318 09:41:57.233157 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" event={"ID":"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50","Type":"ContainerDied","Data":"478f434742f905b8f2086a34869fb64730d5134c202716140a2d3e6b0f090ffb"} Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.723628 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.829278 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ceph\") pod \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.829374 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdnqj\" (UniqueName: \"kubernetes.io/projected/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-kube-api-access-jdnqj\") pod \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.829539 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ssh-key-openstack-edpm-ipam\") pod \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.829662 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-inventory\") pod \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.886391 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ceph" (OuterVolumeSpecName: "ceph") pod "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" (UID: "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.886904 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-kube-api-access-jdnqj" (OuterVolumeSpecName: "kube-api-access-jdnqj") pod "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" (UID: "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50"). InnerVolumeSpecName "kube-api-access-jdnqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.892880 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" (UID: "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.892916 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-inventory" (OuterVolumeSpecName: "inventory") pod "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" (UID: "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.932600 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.932638 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdnqj\" (UniqueName: \"kubernetes.io/projected/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-kube-api-access-jdnqj\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.932651 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.932662 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.255238 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" event={"ID":"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50","Type":"ContainerDied","Data":"b0e26c8012e479cf5cd6cde6333b5f580c7eeb43d328019a02a961863e5192bc"} Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.255273 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0e26c8012e479cf5cd6cde6333b5f580c7eeb43d328019a02a961863e5192bc" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.255329 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.355168 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"] Mar 18 09:41:59 crc kubenswrapper[4778]: E0318 09:41:59.355708 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.355736 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.355982 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.356928 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.362929 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"] Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.368010 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.368271 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.368472 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.369824 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.372073 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.444258 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4775b\" (UniqueName: \"kubernetes.io/projected/34acd7f6-6263-4871-892c-02835ebbab27-kube-api-access-4775b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.444313 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: 
\"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.444338 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.444374 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.547679 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4775b\" (UniqueName: \"kubernetes.io/projected/34acd7f6-6263-4871-892c-02835ebbab27-kube-api-access-4775b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.547793 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.547834 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.547890 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.559857 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.559875 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.562538 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: 
\"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.565124 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4775b\" (UniqueName: \"kubernetes.io/projected/34acd7f6-6263-4871-892c-02835ebbab27-kube-api-access-4775b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.679520 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.011747 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"] Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.137587 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563782-zn5kg"] Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.138623 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.140988 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.141721 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.142291 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.152007 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563782-zn5kg"] Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.262830 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/d3895116-2d67-4e3c-9f3e-e04d3cfe0518-kube-api-access-tbbln\") pod \"auto-csr-approver-29563782-zn5kg\" (UID: \"d3895116-2d67-4e3c-9f3e-e04d3cfe0518\") " pod="openshift-infra/auto-csr-approver-29563782-zn5kg" Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.263688 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" event={"ID":"34acd7f6-6263-4871-892c-02835ebbab27","Type":"ContainerStarted","Data":"64dff72ef9bbd5903ef3f081c89b41c6a089a4023a0fbd63c67b48c1ad47875d"} Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.365448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/d3895116-2d67-4e3c-9f3e-e04d3cfe0518-kube-api-access-tbbln\") pod \"auto-csr-approver-29563782-zn5kg\" (UID: \"d3895116-2d67-4e3c-9f3e-e04d3cfe0518\") " pod="openshift-infra/auto-csr-approver-29563782-zn5kg" Mar 18 09:42:00 crc 
kubenswrapper[4778]: I0318 09:42:00.391058 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/d3895116-2d67-4e3c-9f3e-e04d3cfe0518-kube-api-access-tbbln\") pod \"auto-csr-approver-29563782-zn5kg\" (UID: \"d3895116-2d67-4e3c-9f3e-e04d3cfe0518\") " pod="openshift-infra/auto-csr-approver-29563782-zn5kg" Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.460588 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.933507 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563782-zn5kg"] Mar 18 09:42:00 crc kubenswrapper[4778]: W0318 09:42:00.938786 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3895116_2d67_4e3c_9f3e_e04d3cfe0518.slice/crio-4d07ed75007e2ce8c60dbdf1f56e8a7facf36f0283e4917d0b5f7757b08dd0b6 WatchSource:0}: Error finding container 4d07ed75007e2ce8c60dbdf1f56e8a7facf36f0283e4917d0b5f7757b08dd0b6: Status 404 returned error can't find the container with id 4d07ed75007e2ce8c60dbdf1f56e8a7facf36f0283e4917d0b5f7757b08dd0b6 Mar 18 09:42:01 crc kubenswrapper[4778]: I0318 09:42:01.272906 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" event={"ID":"d3895116-2d67-4e3c-9f3e-e04d3cfe0518","Type":"ContainerStarted","Data":"4d07ed75007e2ce8c60dbdf1f56e8a7facf36f0283e4917d0b5f7757b08dd0b6"} Mar 18 09:42:01 crc kubenswrapper[4778]: I0318 09:42:01.275276 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" event={"ID":"34acd7f6-6263-4871-892c-02835ebbab27","Type":"ContainerStarted","Data":"be013a651fa0255702eeae0ccf13a8dfade5d16e54b7dd5fc488b36567907797"} Mar 18 09:42:01 crc 
kubenswrapper[4778]: I0318 09:42:01.316287 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" podStartSLOduration=1.910295154 podStartE2EDuration="2.316262393s" podCreationTimestamp="2026-03-18 09:41:59 +0000 UTC" firstStartedPulling="2026-03-18 09:42:00.018585542 +0000 UTC m=+2386.593330392" lastFinishedPulling="2026-03-18 09:42:00.424552791 +0000 UTC m=+2386.999297631" observedRunningTime="2026-03-18 09:42:01.297276698 +0000 UTC m=+2387.872021568" watchObservedRunningTime="2026-03-18 09:42:01.316262393 +0000 UTC m=+2387.891007263" Mar 18 09:42:02 crc kubenswrapper[4778]: I0318 09:42:02.282843 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" event={"ID":"d3895116-2d67-4e3c-9f3e-e04d3cfe0518","Type":"ContainerStarted","Data":"c1ff920321931ae33985b21e2caf3b4db031e28a61271d5d1f2c36e681d955e6"} Mar 18 09:42:02 crc kubenswrapper[4778]: I0318 09:42:02.305918 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" podStartSLOduration=1.369680112 podStartE2EDuration="2.305892489s" podCreationTimestamp="2026-03-18 09:42:00 +0000 UTC" firstStartedPulling="2026-03-18 09:42:00.94406543 +0000 UTC m=+2387.518810280" lastFinishedPulling="2026-03-18 09:42:01.880277817 +0000 UTC m=+2388.455022657" observedRunningTime="2026-03-18 09:42:02.296643248 +0000 UTC m=+2388.871388098" watchObservedRunningTime="2026-03-18 09:42:02.305892489 +0000 UTC m=+2388.880637339" Mar 18 09:42:03 crc kubenswrapper[4778]: I0318 09:42:03.291748 4778 generic.go:334] "Generic (PLEG): container finished" podID="d3895116-2d67-4e3c-9f3e-e04d3cfe0518" containerID="c1ff920321931ae33985b21e2caf3b4db031e28a61271d5d1f2c36e681d955e6" exitCode=0 Mar 18 09:42:03 crc kubenswrapper[4778]: I0318 09:42:03.291851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29563782-zn5kg" event={"ID":"d3895116-2d67-4e3c-9f3e-e04d3cfe0518","Type":"ContainerDied","Data":"c1ff920321931ae33985b21e2caf3b4db031e28a61271d5d1f2c36e681d955e6"} Mar 18 09:42:04 crc kubenswrapper[4778]: I0318 09:42:04.306174 4778 generic.go:334] "Generic (PLEG): container finished" podID="34acd7f6-6263-4871-892c-02835ebbab27" containerID="be013a651fa0255702eeae0ccf13a8dfade5d16e54b7dd5fc488b36567907797" exitCode=0 Mar 18 09:42:04 crc kubenswrapper[4778]: I0318 09:42:04.306291 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" event={"ID":"34acd7f6-6263-4871-892c-02835ebbab27","Type":"ContainerDied","Data":"be013a651fa0255702eeae0ccf13a8dfade5d16e54b7dd5fc488b36567907797"} Mar 18 09:42:04 crc kubenswrapper[4778]: I0318 09:42:04.651427 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" Mar 18 09:42:04 crc kubenswrapper[4778]: I0318 09:42:04.746470 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/d3895116-2d67-4e3c-9f3e-e04d3cfe0518-kube-api-access-tbbln\") pod \"d3895116-2d67-4e3c-9f3e-e04d3cfe0518\" (UID: \"d3895116-2d67-4e3c-9f3e-e04d3cfe0518\") " Mar 18 09:42:04 crc kubenswrapper[4778]: I0318 09:42:04.753109 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3895116-2d67-4e3c-9f3e-e04d3cfe0518-kube-api-access-tbbln" (OuterVolumeSpecName: "kube-api-access-tbbln") pod "d3895116-2d67-4e3c-9f3e-e04d3cfe0518" (UID: "d3895116-2d67-4e3c-9f3e-e04d3cfe0518"). InnerVolumeSpecName "kube-api-access-tbbln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:42:04 crc kubenswrapper[4778]: I0318 09:42:04.848773 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/d3895116-2d67-4e3c-9f3e-e04d3cfe0518-kube-api-access-tbbln\") on node \"crc\" DevicePath \"\"" Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.319835 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.320011 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" event={"ID":"d3895116-2d67-4e3c-9f3e-e04d3cfe0518","Type":"ContainerDied","Data":"4d07ed75007e2ce8c60dbdf1f56e8a7facf36f0283e4917d0b5f7757b08dd0b6"} Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.320401 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d07ed75007e2ce8c60dbdf1f56e8a7facf36f0283e4917d0b5f7757b08dd0b6" Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.379371 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563776-bxwcm"] Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.386741 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563776-bxwcm"] Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.730680 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.868211 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ssh-key-openstack-edpm-ipam\") pod \"34acd7f6-6263-4871-892c-02835ebbab27\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.868341 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ceph\") pod \"34acd7f6-6263-4871-892c-02835ebbab27\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.868400 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4775b\" (UniqueName: \"kubernetes.io/projected/34acd7f6-6263-4871-892c-02835ebbab27-kube-api-access-4775b\") pod \"34acd7f6-6263-4871-892c-02835ebbab27\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.868539 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-inventory\") pod \"34acd7f6-6263-4871-892c-02835ebbab27\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.872848 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ceph" (OuterVolumeSpecName: "ceph") pod "34acd7f6-6263-4871-892c-02835ebbab27" (UID: "34acd7f6-6263-4871-892c-02835ebbab27"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.873300 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34acd7f6-6263-4871-892c-02835ebbab27-kube-api-access-4775b" (OuterVolumeSpecName: "kube-api-access-4775b") pod "34acd7f6-6263-4871-892c-02835ebbab27" (UID: "34acd7f6-6263-4871-892c-02835ebbab27"). InnerVolumeSpecName "kube-api-access-4775b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.895328 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "34acd7f6-6263-4871-892c-02835ebbab27" (UID: "34acd7f6-6263-4871-892c-02835ebbab27"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.899592 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-inventory" (OuterVolumeSpecName: "inventory") pod "34acd7f6-6263-4871-892c-02835ebbab27" (UID: "34acd7f6-6263-4871-892c-02835ebbab27"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.970868 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.970898 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4775b\" (UniqueName: \"kubernetes.io/projected/34acd7f6-6263-4871-892c-02835ebbab27-kube-api-access-4775b\") on node \"crc\" DevicePath \"\"" Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.970910 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.970922 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.196964 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b14b14c0-2e4e-420d-bdba-234de9130e4a" path="/var/lib/kubelet/pods/b14b14c0-2e4e-420d-bdba-234de9130e4a/volumes" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.330610 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" event={"ID":"34acd7f6-6263-4871-892c-02835ebbab27","Type":"ContainerDied","Data":"64dff72ef9bbd5903ef3f081c89b41c6a089a4023a0fbd63c67b48c1ad47875d"} Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.330651 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64dff72ef9bbd5903ef3f081c89b41c6a089a4023a0fbd63c67b48c1ad47875d" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.330690 4778 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.399851 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"] Mar 18 09:42:06 crc kubenswrapper[4778]: E0318 09:42:06.402437 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3895116-2d67-4e3c-9f3e-e04d3cfe0518" containerName="oc" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.402460 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3895116-2d67-4e3c-9f3e-e04d3cfe0518" containerName="oc" Mar 18 09:42:06 crc kubenswrapper[4778]: E0318 09:42:06.402474 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34acd7f6-6263-4871-892c-02835ebbab27" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.402505 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="34acd7f6-6263-4871-892c-02835ebbab27" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.402745 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3895116-2d67-4e3c-9f3e-e04d3cfe0518" containerName="oc" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.402770 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="34acd7f6-6263-4871-892c-02835ebbab27" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.403478 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.405673 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.405734 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.406055 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.406766 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.407760 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.412181 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"] Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.478732 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.478801 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: 
\"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.479345 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6pd7\" (UniqueName: \"kubernetes.io/projected/4f5bf2d2-78b2-4358-a582-482ab3020da3-kube-api-access-p6pd7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.479508 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.581761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6pd7\" (UniqueName: \"kubernetes.io/projected/4f5bf2d2-78b2-4358-a582-482ab3020da3-kube-api-access-p6pd7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.581809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.581841 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.581865 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.586931 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.595120 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.595162 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: 
\"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.597823 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6pd7\" (UniqueName: \"kubernetes.io/projected/4f5bf2d2-78b2-4358-a582-482ab3020da3-kube-api-access-p6pd7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.760915 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:07 crc kubenswrapper[4778]: I0318 09:42:07.244979 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"] Mar 18 09:42:07 crc kubenswrapper[4778]: I0318 09:42:07.339463 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" event={"ID":"4f5bf2d2-78b2-4358-a582-482ab3020da3","Type":"ContainerStarted","Data":"44811a06483085aaef916ab0be67a0c9b5c5146057332aa0b5f277cb983c4586"} Mar 18 09:42:08 crc kubenswrapper[4778]: I0318 09:42:08.187629 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:42:08 crc kubenswrapper[4778]: E0318 09:42:08.188181 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:42:08 crc 
kubenswrapper[4778]: I0318 09:42:08.351587 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" event={"ID":"4f5bf2d2-78b2-4358-a582-482ab3020da3","Type":"ContainerStarted","Data":"dbcc106a61d73088c3c96f953c97d1eef401b54afb2440293b042938c4ada77f"} Mar 18 09:42:08 crc kubenswrapper[4778]: I0318 09:42:08.382937 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" podStartSLOduration=1.9631508709999999 podStartE2EDuration="2.382912517s" podCreationTimestamp="2026-03-18 09:42:06 +0000 UTC" firstStartedPulling="2026-03-18 09:42:07.245491804 +0000 UTC m=+2393.820236654" lastFinishedPulling="2026-03-18 09:42:07.66525344 +0000 UTC m=+2394.239998300" observedRunningTime="2026-03-18 09:42:08.376436752 +0000 UTC m=+2394.951181662" watchObservedRunningTime="2026-03-18 09:42:08.382912517 +0000 UTC m=+2394.957657397" Mar 18 09:42:19 crc kubenswrapper[4778]: I0318 09:42:19.187708 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:42:19 crc kubenswrapper[4778]: E0318 09:42:19.188782 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:42:30 crc kubenswrapper[4778]: I0318 09:42:30.187575 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:42:30 crc kubenswrapper[4778]: E0318 09:42:30.189440 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:42:32 crc kubenswrapper[4778]: I0318 09:42:32.217792 4778 scope.go:117] "RemoveContainer" containerID="037bb0f9fdf9935b25af1bbd8db6391c200ce1a888406ad48350f6fbf2f0253c" Mar 18 09:42:45 crc kubenswrapper[4778]: I0318 09:42:45.187911 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:42:45 crc kubenswrapper[4778]: E0318 09:42:45.189503 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:42:45 crc kubenswrapper[4778]: I0318 09:42:45.670544 4778 generic.go:334] "Generic (PLEG): container finished" podID="4f5bf2d2-78b2-4358-a582-482ab3020da3" containerID="dbcc106a61d73088c3c96f953c97d1eef401b54afb2440293b042938c4ada77f" exitCode=0 Mar 18 09:42:45 crc kubenswrapper[4778]: I0318 09:42:45.670603 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" event={"ID":"4f5bf2d2-78b2-4358-a582-482ab3020da3","Type":"ContainerDied","Data":"dbcc106a61d73088c3c96f953c97d1eef401b54afb2440293b042938c4ada77f"} Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.092918 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.212439 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-inventory\") pod \"4f5bf2d2-78b2-4358-a582-482ab3020da3\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.212508 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ssh-key-openstack-edpm-ipam\") pod \"4f5bf2d2-78b2-4358-a582-482ab3020da3\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.212595 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ceph\") pod \"4f5bf2d2-78b2-4358-a582-482ab3020da3\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.212645 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6pd7\" (UniqueName: \"kubernetes.io/projected/4f5bf2d2-78b2-4358-a582-482ab3020da3-kube-api-access-p6pd7\") pod \"4f5bf2d2-78b2-4358-a582-482ab3020da3\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.219516 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5bf2d2-78b2-4358-a582-482ab3020da3-kube-api-access-p6pd7" (OuterVolumeSpecName: "kube-api-access-p6pd7") pod "4f5bf2d2-78b2-4358-a582-482ab3020da3" (UID: "4f5bf2d2-78b2-4358-a582-482ab3020da3"). InnerVolumeSpecName "kube-api-access-p6pd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.219622 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ceph" (OuterVolumeSpecName: "ceph") pod "4f5bf2d2-78b2-4358-a582-482ab3020da3" (UID: "4f5bf2d2-78b2-4358-a582-482ab3020da3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.239357 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-inventory" (OuterVolumeSpecName: "inventory") pod "4f5bf2d2-78b2-4358-a582-482ab3020da3" (UID: "4f5bf2d2-78b2-4358-a582-482ab3020da3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.240249 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4f5bf2d2-78b2-4358-a582-482ab3020da3" (UID: "4f5bf2d2-78b2-4358-a582-482ab3020da3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.314732 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.314769 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6pd7\" (UniqueName: \"kubernetes.io/projected/4f5bf2d2-78b2-4358-a582-482ab3020da3-kube-api-access-p6pd7\") on node \"crc\" DevicePath \"\"" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.314777 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.314787 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.692859 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" event={"ID":"4f5bf2d2-78b2-4358-a582-482ab3020da3","Type":"ContainerDied","Data":"44811a06483085aaef916ab0be67a0c9b5c5146057332aa0b5f277cb983c4586"} Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.692907 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44811a06483085aaef916ab0be67a0c9b5c5146057332aa0b5f277cb983c4586" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.692922 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.789500 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j74ts"] Mar 18 09:42:47 crc kubenswrapper[4778]: E0318 09:42:47.789846 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5bf2d2-78b2-4358-a582-482ab3020da3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.789863 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5bf2d2-78b2-4358-a582-482ab3020da3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.790040 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f5bf2d2-78b2-4358-a582-482ab3020da3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.790620 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.793645 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.793929 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.794603 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.799139 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.799655 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.803227 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j74ts"] Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.929372 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ceph\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.929843 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 
09:42:47.930050 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.930247 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l272r\" (UniqueName: \"kubernetes.io/projected/53b18647-af19-457c-9543-2156c1ace738-kube-api-access-l272r\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.055063 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.055179 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.055343 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l272r\" (UniqueName: \"kubernetes.io/projected/53b18647-af19-457c-9543-2156c1ace738-kube-api-access-l272r\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: 
\"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.055465 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ceph\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.061074 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.062024 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ceph\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.062180 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.073111 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l272r\" (UniqueName: \"kubernetes.io/projected/53b18647-af19-457c-9543-2156c1ace738-kube-api-access-l272r\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: 
\"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.107750 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.651436 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j74ts"] Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.702087 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" event={"ID":"53b18647-af19-457c-9543-2156c1ace738","Type":"ContainerStarted","Data":"0fee9be56247863c641ddf6eb6613d75b7610950defcf4b488a7e3467f580b16"} Mar 18 09:42:49 crc kubenswrapper[4778]: I0318 09:42:49.712718 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" event={"ID":"53b18647-af19-457c-9543-2156c1ace738","Type":"ContainerStarted","Data":"332cb2eccc04e7ff5891be1d6080a18ce1ecf2c442a1afa15fca75f54c50a428"} Mar 18 09:42:58 crc kubenswrapper[4778]: I0318 09:42:58.187820 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:42:58 crc kubenswrapper[4778]: E0318 09:42:58.189153 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:42:58 crc kubenswrapper[4778]: I0318 09:42:58.828550 4778 generic.go:334] "Generic (PLEG): container finished" podID="53b18647-af19-457c-9543-2156c1ace738" 
containerID="332cb2eccc04e7ff5891be1d6080a18ce1ecf2c442a1afa15fca75f54c50a428" exitCode=0 Mar 18 09:42:58 crc kubenswrapper[4778]: I0318 09:42:58.828603 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" event={"ID":"53b18647-af19-457c-9543-2156c1ace738","Type":"ContainerDied","Data":"332cb2eccc04e7ff5891be1d6080a18ce1ecf2c442a1afa15fca75f54c50a428"} Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.259088 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.377494 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-inventory-0\") pod \"53b18647-af19-457c-9543-2156c1ace738\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.377745 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ceph\") pod \"53b18647-af19-457c-9543-2156c1ace738\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.377790 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ssh-key-openstack-edpm-ipam\") pod \"53b18647-af19-457c-9543-2156c1ace738\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.377831 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l272r\" (UniqueName: \"kubernetes.io/projected/53b18647-af19-457c-9543-2156c1ace738-kube-api-access-l272r\") pod \"53b18647-af19-457c-9543-2156c1ace738\" (UID: 
\"53b18647-af19-457c-9543-2156c1ace738\") " Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.382985 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ceph" (OuterVolumeSpecName: "ceph") pod "53b18647-af19-457c-9543-2156c1ace738" (UID: "53b18647-af19-457c-9543-2156c1ace738"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.386283 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b18647-af19-457c-9543-2156c1ace738-kube-api-access-l272r" (OuterVolumeSpecName: "kube-api-access-l272r") pod "53b18647-af19-457c-9543-2156c1ace738" (UID: "53b18647-af19-457c-9543-2156c1ace738"). InnerVolumeSpecName "kube-api-access-l272r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.402824 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "53b18647-af19-457c-9543-2156c1ace738" (UID: "53b18647-af19-457c-9543-2156c1ace738"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.402916 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "53b18647-af19-457c-9543-2156c1ace738" (UID: "53b18647-af19-457c-9543-2156c1ace738"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.481049 4778 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.481111 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.481132 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.481154 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l272r\" (UniqueName: \"kubernetes.io/projected/53b18647-af19-457c-9543-2156c1ace738-kube-api-access-l272r\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.849535 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" event={"ID":"53b18647-af19-457c-9543-2156c1ace738","Type":"ContainerDied","Data":"0fee9be56247863c641ddf6eb6613d75b7610950defcf4b488a7e3467f580b16"} Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.849584 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fee9be56247863c641ddf6eb6613d75b7610950defcf4b488a7e3467f580b16" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.849597 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.946837 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8"] Mar 18 09:43:00 crc kubenswrapper[4778]: E0318 09:43:00.947280 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b18647-af19-457c-9543-2156c1ace738" containerName="ssh-known-hosts-edpm-deployment" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.947304 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b18647-af19-457c-9543-2156c1ace738" containerName="ssh-known-hosts-edpm-deployment" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.947608 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="53b18647-af19-457c-9543-2156c1ace738" containerName="ssh-known-hosts-edpm-deployment" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.948886 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.953653 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.953924 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.954076 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.955362 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.955435 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.963646 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8"] Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.991588 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.992148 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.992435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.992715 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr96f\" (UniqueName: \"kubernetes.io/projected/80a8d263-9bba-4db0-928e-f633b4ad5314-kube-api-access-lr96f\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.100532 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.100655 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.100706 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.102618 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr96f\" (UniqueName: \"kubernetes.io/projected/80a8d263-9bba-4db0-928e-f633b4ad5314-kube-api-access-lr96f\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.111086 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.120793 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.125576 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr96f\" (UniqueName: \"kubernetes.io/projected/80a8d263-9bba-4db0-928e-f633b4ad5314-kube-api-access-lr96f\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.135854 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.275834 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.806483 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8"] Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.813364 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.859904 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" event={"ID":"80a8d263-9bba-4db0-928e-f633b4ad5314","Type":"ContainerStarted","Data":"4183fc257c60cba8437e4d2ec61ceb6c06c87ebd92360ab71f51f3813dcba927"} Mar 18 09:43:02 crc kubenswrapper[4778]: I0318 09:43:02.871252 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" event={"ID":"80a8d263-9bba-4db0-928e-f633b4ad5314","Type":"ContainerStarted","Data":"45f5d946b0edf5c89ec37724c8001aa3f17b98318f01aa4199786bc97f369fdf"} Mar 18 09:43:02 crc kubenswrapper[4778]: I0318 09:43:02.902928 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" podStartSLOduration=2.293936868 podStartE2EDuration="2.902900749s" podCreationTimestamp="2026-03-18 09:43:00 +0000 UTC" firstStartedPulling="2026-03-18 09:43:01.813166722 +0000 UTC m=+2448.387911562" 
lastFinishedPulling="2026-03-18 09:43:02.422130573 +0000 UTC m=+2448.996875443" observedRunningTime="2026-03-18 09:43:02.894820189 +0000 UTC m=+2449.469565069" watchObservedRunningTime="2026-03-18 09:43:02.902900749 +0000 UTC m=+2449.477645619" Mar 18 09:43:09 crc kubenswrapper[4778]: I0318 09:43:09.937651 4778 generic.go:334] "Generic (PLEG): container finished" podID="80a8d263-9bba-4db0-928e-f633b4ad5314" containerID="45f5d946b0edf5c89ec37724c8001aa3f17b98318f01aa4199786bc97f369fdf" exitCode=0 Mar 18 09:43:09 crc kubenswrapper[4778]: I0318 09:43:09.937772 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" event={"ID":"80a8d263-9bba-4db0-928e-f633b4ad5314","Type":"ContainerDied","Data":"45f5d946b0edf5c89ec37724c8001aa3f17b98318f01aa4199786bc97f369fdf"} Mar 18 09:43:10 crc kubenswrapper[4778]: I0318 09:43:10.189360 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:43:10 crc kubenswrapper[4778]: E0318 09:43:10.189901 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.386112 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.425987 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr96f\" (UniqueName: \"kubernetes.io/projected/80a8d263-9bba-4db0-928e-f633b4ad5314-kube-api-access-lr96f\") pod \"80a8d263-9bba-4db0-928e-f633b4ad5314\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.426273 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ssh-key-openstack-edpm-ipam\") pod \"80a8d263-9bba-4db0-928e-f633b4ad5314\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.426365 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-inventory\") pod \"80a8d263-9bba-4db0-928e-f633b4ad5314\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.426419 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ceph\") pod \"80a8d263-9bba-4db0-928e-f633b4ad5314\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.433305 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a8d263-9bba-4db0-928e-f633b4ad5314-kube-api-access-lr96f" (OuterVolumeSpecName: "kube-api-access-lr96f") pod "80a8d263-9bba-4db0-928e-f633b4ad5314" (UID: "80a8d263-9bba-4db0-928e-f633b4ad5314"). InnerVolumeSpecName "kube-api-access-lr96f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.433350 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ceph" (OuterVolumeSpecName: "ceph") pod "80a8d263-9bba-4db0-928e-f633b4ad5314" (UID: "80a8d263-9bba-4db0-928e-f633b4ad5314"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.455110 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-inventory" (OuterVolumeSpecName: "inventory") pod "80a8d263-9bba-4db0-928e-f633b4ad5314" (UID: "80a8d263-9bba-4db0-928e-f633b4ad5314"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.456022 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "80a8d263-9bba-4db0-928e-f633b4ad5314" (UID: "80a8d263-9bba-4db0-928e-f633b4ad5314"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.528372 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr96f\" (UniqueName: \"kubernetes.io/projected/80a8d263-9bba-4db0-928e-f633b4ad5314-kube-api-access-lr96f\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.528419 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.528436 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.528450 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.965741 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" event={"ID":"80a8d263-9bba-4db0-928e-f633b4ad5314","Type":"ContainerDied","Data":"4183fc257c60cba8437e4d2ec61ceb6c06c87ebd92360ab71f51f3813dcba927"} Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.966320 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4183fc257c60cba8437e4d2ec61ceb6c06c87ebd92360ab71f51f3813dcba927" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.966050 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.054298 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7"] Mar 18 09:43:12 crc kubenswrapper[4778]: E0318 09:43:12.054879 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a8d263-9bba-4db0-928e-f633b4ad5314" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.054905 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a8d263-9bba-4db0-928e-f633b4ad5314" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.055423 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a8d263-9bba-4db0-928e-f633b4ad5314" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.056401 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.059915 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.060057 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.060094 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.060129 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.060311 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.067386 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7"] Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.144795 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.145171 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzwll\" (UniqueName: \"kubernetes.io/projected/613d0a31-a371-4c66-8254-85a7cc864fd0-kube-api-access-gzwll\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: 
\"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.145294 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.145401 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.248761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.249128 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzwll\" (UniqueName: \"kubernetes.io/projected/613d0a31-a371-4c66-8254-85a7cc864fd0-kube-api-access-gzwll\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.249189 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.250362 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.255269 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.255994 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.262886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.267177 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzwll\" (UniqueName: \"kubernetes.io/projected/613d0a31-a371-4c66-8254-85a7cc864fd0-kube-api-access-gzwll\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.380267 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.940755 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7"] Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.975400 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" event={"ID":"613d0a31-a371-4c66-8254-85a7cc864fd0","Type":"ContainerStarted","Data":"5f314f77b9dfdc3bdb927436906d3194034e3084dc373fdaea997b04d7c042ff"} Mar 18 09:43:13 crc kubenswrapper[4778]: I0318 09:43:13.985924 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" event={"ID":"613d0a31-a371-4c66-8254-85a7cc864fd0","Type":"ContainerStarted","Data":"2de490467d479cfa037c02d250eacac3f30e179767b6a79d7df1d0233e63150e"} Mar 18 09:43:14 crc kubenswrapper[4778]: I0318 09:43:14.000959 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" podStartSLOduration=1.4068783790000001 podStartE2EDuration="2.000940136s" podCreationTimestamp="2026-03-18 09:43:12 +0000 UTC" firstStartedPulling="2026-03-18 09:43:12.939451117 +0000 UTC m=+2459.514195957" 
lastFinishedPulling="2026-03-18 09:43:13.533512834 +0000 UTC m=+2460.108257714" observedRunningTime="2026-03-18 09:43:13.999215109 +0000 UTC m=+2460.573959969" watchObservedRunningTime="2026-03-18 09:43:14.000940136 +0000 UTC m=+2460.575684976" Mar 18 09:43:23 crc kubenswrapper[4778]: I0318 09:43:23.075680 4778 generic.go:334] "Generic (PLEG): container finished" podID="613d0a31-a371-4c66-8254-85a7cc864fd0" containerID="2de490467d479cfa037c02d250eacac3f30e179767b6a79d7df1d0233e63150e" exitCode=0 Mar 18 09:43:23 crc kubenswrapper[4778]: I0318 09:43:23.075790 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" event={"ID":"613d0a31-a371-4c66-8254-85a7cc864fd0","Type":"ContainerDied","Data":"2de490467d479cfa037c02d250eacac3f30e179767b6a79d7df1d0233e63150e"} Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.201519 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:43:24 crc kubenswrapper[4778]: E0318 09:43:24.202036 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.537415 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.711245 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ssh-key-openstack-edpm-ipam\") pod \"613d0a31-a371-4c66-8254-85a7cc864fd0\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.711325 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ceph\") pod \"613d0a31-a371-4c66-8254-85a7cc864fd0\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.711356 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-inventory\") pod \"613d0a31-a371-4c66-8254-85a7cc864fd0\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.711478 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzwll\" (UniqueName: \"kubernetes.io/projected/613d0a31-a371-4c66-8254-85a7cc864fd0-kube-api-access-gzwll\") pod \"613d0a31-a371-4c66-8254-85a7cc864fd0\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.722458 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ceph" (OuterVolumeSpecName: "ceph") pod "613d0a31-a371-4c66-8254-85a7cc864fd0" (UID: "613d0a31-a371-4c66-8254-85a7cc864fd0"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.723414 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/613d0a31-a371-4c66-8254-85a7cc864fd0-kube-api-access-gzwll" (OuterVolumeSpecName: "kube-api-access-gzwll") pod "613d0a31-a371-4c66-8254-85a7cc864fd0" (UID: "613d0a31-a371-4c66-8254-85a7cc864fd0"). InnerVolumeSpecName "kube-api-access-gzwll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.739876 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-inventory" (OuterVolumeSpecName: "inventory") pod "613d0a31-a371-4c66-8254-85a7cc864fd0" (UID: "613d0a31-a371-4c66-8254-85a7cc864fd0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.759799 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "613d0a31-a371-4c66-8254-85a7cc864fd0" (UID: "613d0a31-a371-4c66-8254-85a7cc864fd0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.814641 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.814697 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.814708 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.814719 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzwll\" (UniqueName: \"kubernetes.io/projected/613d0a31-a371-4c66-8254-85a7cc864fd0-kube-api-access-gzwll\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.111704 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" event={"ID":"613d0a31-a371-4c66-8254-85a7cc864fd0","Type":"ContainerDied","Data":"5f314f77b9dfdc3bdb927436906d3194034e3084dc373fdaea997b04d7c042ff"} Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.112253 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f314f77b9dfdc3bdb927436906d3194034e3084dc373fdaea997b04d7c042ff" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.111879 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.241629 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd"] Mar 18 09:43:25 crc kubenswrapper[4778]: E0318 09:43:25.242130 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613d0a31-a371-4c66-8254-85a7cc864fd0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.242148 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="613d0a31-a371-4c66-8254-85a7cc864fd0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.242400 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="613d0a31-a371-4c66-8254-85a7cc864fd0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.243287 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.247142 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.247818 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.248272 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.253092 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.254103 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.254835 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.255234 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.255363 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.261807 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd"] Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.429785 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.430566 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.430902 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.431166 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-kube-api-access-66rh6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.431413 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.431628 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.431866 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.432049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.432259 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ceph\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.432439 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.432637 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.432825 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.432988 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.535413 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.535883 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536130 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-kube-api-access-66rh6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536297 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536414 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536538 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536688 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 
09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536917 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.537049 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.537157 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.537334 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.537496 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.540152 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.541070 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.541690 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.544230 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.544865 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.546535 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.546676 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.547401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" 
(UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.548547 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.548924 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.550538 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.556110 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.568940 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-kube-api-access-66rh6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.863523 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:26 crc kubenswrapper[4778]: I0318 09:43:26.442983 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd"] Mar 18 09:43:27 crc kubenswrapper[4778]: I0318 09:43:27.134467 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" event={"ID":"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5","Type":"ContainerStarted","Data":"c2db1237b9f4739f87209ae21a67408657e301817a1a279b85ac382dd5fae289"} Mar 18 09:43:28 crc kubenswrapper[4778]: I0318 09:43:28.147874 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" event={"ID":"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5","Type":"ContainerStarted","Data":"b744a025e93192ec362ad8429af879e71a000131cc9b2858681fa40afa9f7623"} Mar 18 09:43:28 crc kubenswrapper[4778]: I0318 09:43:28.193473 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" podStartSLOduration=2.722591207 podStartE2EDuration="3.193440902s" podCreationTimestamp="2026-03-18 09:43:25 +0000 UTC" firstStartedPulling="2026-03-18 09:43:26.446518241 +0000 UTC 
m=+2473.021263081" lastFinishedPulling="2026-03-18 09:43:26.917367926 +0000 UTC m=+2473.492112776" observedRunningTime="2026-03-18 09:43:28.181911979 +0000 UTC m=+2474.756656889" watchObservedRunningTime="2026-03-18 09:43:28.193440902 +0000 UTC m=+2474.768185782" Mar 18 09:43:38 crc kubenswrapper[4778]: I0318 09:43:38.187561 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:43:38 crc kubenswrapper[4778]: E0318 09:43:38.188874 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:43:51 crc kubenswrapper[4778]: I0318 09:43:51.187031 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:43:51 crc kubenswrapper[4778]: E0318 09:43:51.187984 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:43:56 crc kubenswrapper[4778]: I0318 09:43:56.422767 4778 generic.go:334] "Generic (PLEG): container finished" podID="7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" containerID="b744a025e93192ec362ad8429af879e71a000131cc9b2858681fa40afa9f7623" exitCode=0 Mar 18 09:43:56 crc kubenswrapper[4778]: I0318 09:43:56.422837 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" event={"ID":"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5","Type":"ContainerDied","Data":"b744a025e93192ec362ad8429af879e71a000131cc9b2858681fa40afa9f7623"} Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.833827 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.891946 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ovn-combined-ca-bundle\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892118 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892163 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892219 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-libvirt-combined-ca-bundle\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: 
\"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892260 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ceph\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892286 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-inventory\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892306 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892347 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-kube-api-access-66rh6\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892369 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ssh-key-openstack-edpm-ipam\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892424 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-repo-setup-combined-ca-bundle\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892449 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-neutron-metadata-combined-ca-bundle\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892474 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-bootstrap-combined-ca-bundle\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892571 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-nova-combined-ca-bundle\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.898968 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-kube-api-access-66rh6" (OuterVolumeSpecName: "kube-api-access-66rh6") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "kube-api-access-66rh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.899302 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.899783 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.899782 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.900559 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.901189 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.901394 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.901720 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ceph" (OuterVolumeSpecName: "ceph") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.902304 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.903614 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.907641 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.934403 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.937993 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-inventory" (OuterVolumeSpecName: "inventory") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994785 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994831 4778 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994846 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994860 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994874 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994886 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-kube-api-access-66rh6\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994899 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994912 4778 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994924 4778 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994936 4778 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994948 4778 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994960 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994973 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.447390 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" event={"ID":"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5","Type":"ContainerDied","Data":"c2db1237b9f4739f87209ae21a67408657e301817a1a279b85ac382dd5fae289"} Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.447450 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2db1237b9f4739f87209ae21a67408657e301817a1a279b85ac382dd5fae289" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.447519 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.566229 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv"] Mar 18 09:43:58 crc kubenswrapper[4778]: E0318 09:43:58.566949 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.567094 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.567434 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.568249 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.573474 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.573474 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.574174 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.574449 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.574664 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.589483 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv"] Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.710906 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.711126 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.711337 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.711421 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzp4p\" (UniqueName: \"kubernetes.io/projected/fed5a515-ed14-40f1-9282-4e87fe319bf6-kube-api-access-bzp4p\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.814258 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.814434 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.814517 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bzp4p\" (UniqueName: \"kubernetes.io/projected/fed5a515-ed14-40f1-9282-4e87fe319bf6-kube-api-access-bzp4p\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.814579 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.820716 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.820754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.824614 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.833181 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzp4p\" (UniqueName: \"kubernetes.io/projected/fed5a515-ed14-40f1-9282-4e87fe319bf6-kube-api-access-bzp4p\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.910174 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:59 crc kubenswrapper[4778]: I0318 09:43:59.532665 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv"] Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.142152 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563784-pdds9"] Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.143632 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.145660 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.145784 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.145979 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.157679 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563784-pdds9"] Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.242367 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqkzb\" (UniqueName: \"kubernetes.io/projected/ab4f60ce-be48-4052-9fa7-905b70e65c3a-kube-api-access-nqkzb\") pod \"auto-csr-approver-29563784-pdds9\" (UID: \"ab4f60ce-be48-4052-9fa7-905b70e65c3a\") " pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.343877 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqkzb\" (UniqueName: \"kubernetes.io/projected/ab4f60ce-be48-4052-9fa7-905b70e65c3a-kube-api-access-nqkzb\") pod \"auto-csr-approver-29563784-pdds9\" (UID: \"ab4f60ce-be48-4052-9fa7-905b70e65c3a\") " pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.373647 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqkzb\" (UniqueName: \"kubernetes.io/projected/ab4f60ce-be48-4052-9fa7-905b70e65c3a-kube-api-access-nqkzb\") pod \"auto-csr-approver-29563784-pdds9\" (UID: \"ab4f60ce-be48-4052-9fa7-905b70e65c3a\") " 
pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.464823 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.470105 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" event={"ID":"fed5a515-ed14-40f1-9282-4e87fe319bf6","Type":"ContainerStarted","Data":"2804ae64e5a590f30623aba762efec2399df5cd3bd84f7d7167e025650c8d12c"} Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.470142 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" event={"ID":"fed5a515-ed14-40f1-9282-4e87fe319bf6","Type":"ContainerStarted","Data":"96de5a1856c48d9ae2a5c28917debb3937af347a9bc8b2632de45457e8977720"} Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.511864 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" podStartSLOduration=2.036255582 podStartE2EDuration="2.511838647s" podCreationTimestamp="2026-03-18 09:43:58 +0000 UTC" firstStartedPulling="2026-03-18 09:43:59.536667705 +0000 UTC m=+2506.111412545" lastFinishedPulling="2026-03-18 09:44:00.01225077 +0000 UTC m=+2506.586995610" observedRunningTime="2026-03-18 09:44:00.497082616 +0000 UTC m=+2507.071827536" watchObservedRunningTime="2026-03-18 09:44:00.511838647 +0000 UTC m=+2507.086583497" Mar 18 09:44:00 crc kubenswrapper[4778]: W0318 09:44:00.999463 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab4f60ce_be48_4052_9fa7_905b70e65c3a.slice/crio-4db45ae02348a6f3c6a84ec8f999c7bd96e6ba2c870cdd5ed8bf4e627f0f149f WatchSource:0}: Error finding container 4db45ae02348a6f3c6a84ec8f999c7bd96e6ba2c870cdd5ed8bf4e627f0f149f: 
Status 404 returned error can't find the container with id 4db45ae02348a6f3c6a84ec8f999c7bd96e6ba2c870cdd5ed8bf4e627f0f149f Mar 18 09:44:01 crc kubenswrapper[4778]: I0318 09:44:01.000620 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563784-pdds9"] Mar 18 09:44:01 crc kubenswrapper[4778]: I0318 09:44:01.480181 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563784-pdds9" event={"ID":"ab4f60ce-be48-4052-9fa7-905b70e65c3a","Type":"ContainerStarted","Data":"4db45ae02348a6f3c6a84ec8f999c7bd96e6ba2c870cdd5ed8bf4e627f0f149f"} Mar 18 09:44:02 crc kubenswrapper[4778]: I0318 09:44:02.488974 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563784-pdds9" event={"ID":"ab4f60ce-be48-4052-9fa7-905b70e65c3a","Type":"ContainerStarted","Data":"8f06326ea2269b974cec86640f654b1a9c686fe2bd62106254f2a194a59e658e"} Mar 18 09:44:02 crc kubenswrapper[4778]: I0318 09:44:02.506937 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563784-pdds9" podStartSLOduration=1.433092322 podStartE2EDuration="2.506920836s" podCreationTimestamp="2026-03-18 09:44:00 +0000 UTC" firstStartedPulling="2026-03-18 09:44:01.003299023 +0000 UTC m=+2507.578043873" lastFinishedPulling="2026-03-18 09:44:02.077127527 +0000 UTC m=+2508.651872387" observedRunningTime="2026-03-18 09:44:02.501806937 +0000 UTC m=+2509.076551787" watchObservedRunningTime="2026-03-18 09:44:02.506920836 +0000 UTC m=+2509.081665676" Mar 18 09:44:03 crc kubenswrapper[4778]: I0318 09:44:03.497951 4778 generic.go:334] "Generic (PLEG): container finished" podID="ab4f60ce-be48-4052-9fa7-905b70e65c3a" containerID="8f06326ea2269b974cec86640f654b1a9c686fe2bd62106254f2a194a59e658e" exitCode=0 Mar 18 09:44:03 crc kubenswrapper[4778]: I0318 09:44:03.498063 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29563784-pdds9" event={"ID":"ab4f60ce-be48-4052-9fa7-905b70e65c3a","Type":"ContainerDied","Data":"8f06326ea2269b974cec86640f654b1a9c686fe2bd62106254f2a194a59e658e"} Mar 18 09:44:04 crc kubenswrapper[4778]: I0318 09:44:04.874049 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:04 crc kubenswrapper[4778]: I0318 09:44:04.971685 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqkzb\" (UniqueName: \"kubernetes.io/projected/ab4f60ce-be48-4052-9fa7-905b70e65c3a-kube-api-access-nqkzb\") pod \"ab4f60ce-be48-4052-9fa7-905b70e65c3a\" (UID: \"ab4f60ce-be48-4052-9fa7-905b70e65c3a\") " Mar 18 09:44:04 crc kubenswrapper[4778]: I0318 09:44:04.978642 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab4f60ce-be48-4052-9fa7-905b70e65c3a-kube-api-access-nqkzb" (OuterVolumeSpecName: "kube-api-access-nqkzb") pod "ab4f60ce-be48-4052-9fa7-905b70e65c3a" (UID: "ab4f60ce-be48-4052-9fa7-905b70e65c3a"). InnerVolumeSpecName "kube-api-access-nqkzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.074380 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqkzb\" (UniqueName: \"kubernetes.io/projected/ab4f60ce-be48-4052-9fa7-905b70e65c3a-kube-api-access-nqkzb\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.528679 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563784-pdds9" event={"ID":"ab4f60ce-be48-4052-9fa7-905b70e65c3a","Type":"ContainerDied","Data":"4db45ae02348a6f3c6a84ec8f999c7bd96e6ba2c870cdd5ed8bf4e627f0f149f"} Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.529178 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db45ae02348a6f3c6a84ec8f999c7bd96e6ba2c870cdd5ed8bf4e627f0f149f" Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.529344 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.536375 4778 generic.go:334] "Generic (PLEG): container finished" podID="fed5a515-ed14-40f1-9282-4e87fe319bf6" containerID="2804ae64e5a590f30623aba762efec2399df5cd3bd84f7d7167e025650c8d12c" exitCode=0 Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.536457 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" event={"ID":"fed5a515-ed14-40f1-9282-4e87fe319bf6","Type":"ContainerDied","Data":"2804ae64e5a590f30623aba762efec2399df5cd3bd84f7d7167e025650c8d12c"} Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.605493 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563778-4dm5p"] Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.617966 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-infra/auto-csr-approver-29563778-4dm5p"] Mar 18 09:44:06 crc kubenswrapper[4778]: I0318 09:44:06.187614 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:44:06 crc kubenswrapper[4778]: E0318 09:44:06.188240 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:44:06 crc kubenswrapper[4778]: I0318 09:44:06.198831 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0548485b-4f03-47ba-8a13-4e3522451291" path="/var/lib/kubelet/pods/0548485b-4f03-47ba-8a13-4e3522451291/volumes" Mar 18 09:44:06 crc kubenswrapper[4778]: I0318 09:44:06.904039 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.011784 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-inventory\") pod \"fed5a515-ed14-40f1-9282-4e87fe319bf6\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.011900 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzp4p\" (UniqueName: \"kubernetes.io/projected/fed5a515-ed14-40f1-9282-4e87fe319bf6-kube-api-access-bzp4p\") pod \"fed5a515-ed14-40f1-9282-4e87fe319bf6\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.011930 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ceph\") pod \"fed5a515-ed14-40f1-9282-4e87fe319bf6\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.012077 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ssh-key-openstack-edpm-ipam\") pod \"fed5a515-ed14-40f1-9282-4e87fe319bf6\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.017763 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ceph" (OuterVolumeSpecName: "ceph") pod "fed5a515-ed14-40f1-9282-4e87fe319bf6" (UID: "fed5a515-ed14-40f1-9282-4e87fe319bf6"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.023595 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed5a515-ed14-40f1-9282-4e87fe319bf6-kube-api-access-bzp4p" (OuterVolumeSpecName: "kube-api-access-bzp4p") pod "fed5a515-ed14-40f1-9282-4e87fe319bf6" (UID: "fed5a515-ed14-40f1-9282-4e87fe319bf6"). InnerVolumeSpecName "kube-api-access-bzp4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.042863 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fed5a515-ed14-40f1-9282-4e87fe319bf6" (UID: "fed5a515-ed14-40f1-9282-4e87fe319bf6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.056217 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-inventory" (OuterVolumeSpecName: "inventory") pod "fed5a515-ed14-40f1-9282-4e87fe319bf6" (UID: "fed5a515-ed14-40f1-9282-4e87fe319bf6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.114714 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.114755 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzp4p\" (UniqueName: \"kubernetes.io/projected/fed5a515-ed14-40f1-9282-4e87fe319bf6-kube-api-access-bzp4p\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.114770 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.114782 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.559547 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" event={"ID":"fed5a515-ed14-40f1-9282-4e87fe319bf6","Type":"ContainerDied","Data":"96de5a1856c48d9ae2a5c28917debb3937af347a9bc8b2632de45457e8977720"} Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.559606 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96de5a1856c48d9ae2a5c28917debb3937af347a9bc8b2632de45457e8977720" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.559678 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.662538 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd"] Mar 18 09:44:07 crc kubenswrapper[4778]: E0318 09:44:07.662997 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed5a515-ed14-40f1-9282-4e87fe319bf6" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.663025 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed5a515-ed14-40f1-9282-4e87fe319bf6" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 18 09:44:07 crc kubenswrapper[4778]: E0318 09:44:07.663040 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4f60ce-be48-4052-9fa7-905b70e65c3a" containerName="oc" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.663049 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4f60ce-be48-4052-9fa7-905b70e65c3a" containerName="oc" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.663296 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed5a515-ed14-40f1-9282-4e87fe319bf6" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.663330 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab4f60ce-be48-4052-9fa7-905b70e65c3a" containerName="oc" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.665126 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.667596 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.667910 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.670414 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.678661 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.678704 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.678661 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.694329 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd"] Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.826588 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8df\" (UniqueName: \"kubernetes.io/projected/1f0f4177-ad12-4848-bbd7-39b004344cb3-kube-api-access-fc8df\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.826682 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.826778 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.826815 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.826847 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.827025 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.929357 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.929478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.929567 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.930585 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.930710 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8df\" (UniqueName: 
\"kubernetes.io/projected/1f0f4177-ad12-4848-bbd7-39b004344cb3-kube-api-access-fc8df\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.930768 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.932459 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.935819 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.936324 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: 
I0318 09:44:07.936694 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.938455 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.966454 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8df\" (UniqueName: \"kubernetes.io/projected/1f0f4177-ad12-4848-bbd7-39b004344cb3-kube-api-access-fc8df\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.994448 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:08 crc kubenswrapper[4778]: I0318 09:44:08.564888 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd"] Mar 18 09:44:09 crc kubenswrapper[4778]: I0318 09:44:09.577317 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" event={"ID":"1f0f4177-ad12-4848-bbd7-39b004344cb3","Type":"ContainerStarted","Data":"c96da0fdc9f23d1c8174300e8944755e5546994203de0c9b38e19a45beb705b3"} Mar 18 09:44:09 crc kubenswrapper[4778]: I0318 09:44:09.578719 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" event={"ID":"1f0f4177-ad12-4848-bbd7-39b004344cb3","Type":"ContainerStarted","Data":"54b76851245c233c2784f282b1ca1eb9cfa025c851c32932417d057083ffca1c"} Mar 18 09:44:09 crc kubenswrapper[4778]: I0318 09:44:09.607622 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" podStartSLOduration=2.10382655 podStartE2EDuration="2.607606851s" podCreationTimestamp="2026-03-18 09:44:07 +0000 UTC" firstStartedPulling="2026-03-18 09:44:08.572044948 +0000 UTC m=+2515.146789798" lastFinishedPulling="2026-03-18 09:44:09.075825249 +0000 UTC m=+2515.650570099" observedRunningTime="2026-03-18 09:44:09.605831124 +0000 UTC m=+2516.180576004" watchObservedRunningTime="2026-03-18 09:44:09.607606851 +0000 UTC m=+2516.182351681" Mar 18 09:44:20 crc kubenswrapper[4778]: I0318 09:44:20.188078 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:44:20 crc kubenswrapper[4778]: E0318 09:44:20.189004 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:44:31 crc kubenswrapper[4778]: I0318 09:44:31.187121 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:44:31 crc kubenswrapper[4778]: E0318 09:44:31.188270 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:44:32 crc kubenswrapper[4778]: I0318 09:44:32.328588 4778 scope.go:117] "RemoveContainer" containerID="f6adcd9d5f24124681eed0d00263f7ac4a19be40ad724c067b9849cb1ce141e4" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.724554 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ftnlp"] Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.729639 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.736564 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftnlp"] Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.826430 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-catalog-content\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.826495 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-utilities\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.826665 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxnqx\" (UniqueName: \"kubernetes.io/projected/76a702d2-54ab-444c-bf6c-cc815acef4d7-kube-api-access-vxnqx\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.928459 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxnqx\" (UniqueName: \"kubernetes.io/projected/76a702d2-54ab-444c-bf6c-cc815acef4d7-kube-api-access-vxnqx\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.928644 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-catalog-content\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.928668 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-utilities\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.929374 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-catalog-content\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.929506 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-utilities\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.952939 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxnqx\" (UniqueName: \"kubernetes.io/projected/76a702d2-54ab-444c-bf6c-cc815acef4d7-kube-api-access-vxnqx\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:35 crc kubenswrapper[4778]: I0318 09:44:35.049436 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:35 crc kubenswrapper[4778]: I0318 09:44:35.551823 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftnlp"] Mar 18 09:44:35 crc kubenswrapper[4778]: I0318 09:44:35.830673 4778 generic.go:334] "Generic (PLEG): container finished" podID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerID="ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e" exitCode=0 Mar 18 09:44:35 crc kubenswrapper[4778]: I0318 09:44:35.830789 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftnlp" event={"ID":"76a702d2-54ab-444c-bf6c-cc815acef4d7","Type":"ContainerDied","Data":"ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e"} Mar 18 09:44:35 crc kubenswrapper[4778]: I0318 09:44:35.831027 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftnlp" event={"ID":"76a702d2-54ab-444c-bf6c-cc815acef4d7","Type":"ContainerStarted","Data":"3e1b274b36934e80c0a1d5e469bbaee10925f2146c30a4ddf513987aa5e061ef"} Mar 18 09:44:36 crc kubenswrapper[4778]: I0318 09:44:36.844458 4778 generic.go:334] "Generic (PLEG): container finished" podID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerID="f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900" exitCode=0 Mar 18 09:44:36 crc kubenswrapper[4778]: I0318 09:44:36.844559 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftnlp" event={"ID":"76a702d2-54ab-444c-bf6c-cc815acef4d7","Type":"ContainerDied","Data":"f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900"} Mar 18 09:44:37 crc kubenswrapper[4778]: I0318 09:44:37.878620 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftnlp" 
event={"ID":"76a702d2-54ab-444c-bf6c-cc815acef4d7","Type":"ContainerStarted","Data":"79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c"} Mar 18 09:44:37 crc kubenswrapper[4778]: I0318 09:44:37.903673 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ftnlp" podStartSLOduration=2.439522425 podStartE2EDuration="3.903654503s" podCreationTimestamp="2026-03-18 09:44:34 +0000 UTC" firstStartedPulling="2026-03-18 09:44:35.832210688 +0000 UTC m=+2542.406955528" lastFinishedPulling="2026-03-18 09:44:37.296342766 +0000 UTC m=+2543.871087606" observedRunningTime="2026-03-18 09:44:37.899398527 +0000 UTC m=+2544.474143367" watchObservedRunningTime="2026-03-18 09:44:37.903654503 +0000 UTC m=+2544.478399343" Mar 18 09:44:42 crc kubenswrapper[4778]: I0318 09:44:42.189317 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:44:42 crc kubenswrapper[4778]: E0318 09:44:42.190599 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:44:45 crc kubenswrapper[4778]: I0318 09:44:45.050252 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:45 crc kubenswrapper[4778]: I0318 09:44:45.050560 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:45 crc kubenswrapper[4778]: I0318 09:44:45.125364 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:46 crc kubenswrapper[4778]: I0318 09:44:46.029304 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:46 crc kubenswrapper[4778]: I0318 09:44:46.070719 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftnlp"] Mar 18 09:44:47 crc kubenswrapper[4778]: I0318 09:44:47.998112 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ftnlp" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="registry-server" containerID="cri-o://79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c" gracePeriod=2 Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.465443 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.590166 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxnqx\" (UniqueName: \"kubernetes.io/projected/76a702d2-54ab-444c-bf6c-cc815acef4d7-kube-api-access-vxnqx\") pod \"76a702d2-54ab-444c-bf6c-cc815acef4d7\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.590564 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-catalog-content\") pod \"76a702d2-54ab-444c-bf6c-cc815acef4d7\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.590801 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-utilities\") pod 
\"76a702d2-54ab-444c-bf6c-cc815acef4d7\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.592491 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-utilities" (OuterVolumeSpecName: "utilities") pod "76a702d2-54ab-444c-bf6c-cc815acef4d7" (UID: "76a702d2-54ab-444c-bf6c-cc815acef4d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.595006 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a702d2-54ab-444c-bf6c-cc815acef4d7-kube-api-access-vxnqx" (OuterVolumeSpecName: "kube-api-access-vxnqx") pod "76a702d2-54ab-444c-bf6c-cc815acef4d7" (UID: "76a702d2-54ab-444c-bf6c-cc815acef4d7"). InnerVolumeSpecName "kube-api-access-vxnqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.665171 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76a702d2-54ab-444c-bf6c-cc815acef4d7" (UID: "76a702d2-54ab-444c-bf6c-cc815acef4d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.693533 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.693571 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxnqx\" (UniqueName: \"kubernetes.io/projected/76a702d2-54ab-444c-bf6c-cc815acef4d7-kube-api-access-vxnqx\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.693588 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.028110 4778 generic.go:334] "Generic (PLEG): container finished" podID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerID="79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c" exitCode=0 Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.028151 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftnlp" event={"ID":"76a702d2-54ab-444c-bf6c-cc815acef4d7","Type":"ContainerDied","Data":"79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c"} Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.028179 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftnlp" event={"ID":"76a702d2-54ab-444c-bf6c-cc815acef4d7","Type":"ContainerDied","Data":"3e1b274b36934e80c0a1d5e469bbaee10925f2146c30a4ddf513987aa5e061ef"} Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.028179 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.028248 4778 scope.go:117] "RemoveContainer" containerID="79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.061334 4778 scope.go:117] "RemoveContainer" containerID="f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.068222 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftnlp"] Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.077214 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ftnlp"] Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.086067 4778 scope.go:117] "RemoveContainer" containerID="ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.121938 4778 scope.go:117] "RemoveContainer" containerID="79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c" Mar 18 09:44:49 crc kubenswrapper[4778]: E0318 09:44:49.122353 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c\": container with ID starting with 79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c not found: ID does not exist" containerID="79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.122406 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c"} err="failed to get container status \"79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c\": rpc error: code = NotFound desc = could not find 
container \"79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c\": container with ID starting with 79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c not found: ID does not exist" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.122436 4778 scope.go:117] "RemoveContainer" containerID="f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900" Mar 18 09:44:49 crc kubenswrapper[4778]: E0318 09:44:49.122767 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900\": container with ID starting with f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900 not found: ID does not exist" containerID="f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.122804 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900"} err="failed to get container status \"f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900\": rpc error: code = NotFound desc = could not find container \"f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900\": container with ID starting with f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900 not found: ID does not exist" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.122827 4778 scope.go:117] "RemoveContainer" containerID="ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e" Mar 18 09:44:49 crc kubenswrapper[4778]: E0318 09:44:49.123095 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e\": container with ID starting with ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e not found: ID does 
not exist" containerID="ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.123124 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e"} err="failed to get container status \"ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e\": rpc error: code = NotFound desc = could not find container \"ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e\": container with ID starting with ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e not found: ID does not exist" Mar 18 09:44:50 crc kubenswrapper[4778]: I0318 09:44:50.207455 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" path="/var/lib/kubelet/pods/76a702d2-54ab-444c-bf6c-cc815acef4d7/volumes" Mar 18 09:44:53 crc kubenswrapper[4778]: I0318 09:44:53.188076 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:44:53 crc kubenswrapper[4778]: E0318 09:44:53.189242 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.172532 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq"] Mar 18 09:45:00 crc kubenswrapper[4778]: E0318 09:45:00.173415 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" 
containerName="extract-utilities" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.173431 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="extract-utilities" Mar 18 09:45:00 crc kubenswrapper[4778]: E0318 09:45:00.173455 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="registry-server" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.173463 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="registry-server" Mar 18 09:45:00 crc kubenswrapper[4778]: E0318 09:45:00.173492 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="extract-content" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.173499 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="extract-content" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.173717 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="registry-server" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.174520 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.177250 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.184466 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.204342 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq"] Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.261168 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956ed194-df94-4b74-919f-9cdcfbdcf5a7-secret-volume\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.261427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxkdc\" (UniqueName: \"kubernetes.io/projected/956ed194-df94-4b74-919f-9cdcfbdcf5a7-kube-api-access-hxkdc\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.261455 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956ed194-df94-4b74-919f-9cdcfbdcf5a7-config-volume\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.362734 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxkdc\" (UniqueName: \"kubernetes.io/projected/956ed194-df94-4b74-919f-9cdcfbdcf5a7-kube-api-access-hxkdc\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.362778 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956ed194-df94-4b74-919f-9cdcfbdcf5a7-config-volume\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.362809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956ed194-df94-4b74-919f-9cdcfbdcf5a7-secret-volume\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.364223 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956ed194-df94-4b74-919f-9cdcfbdcf5a7-config-volume\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.367883 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/956ed194-df94-4b74-919f-9cdcfbdcf5a7-secret-volume\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.381995 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxkdc\" (UniqueName: \"kubernetes.io/projected/956ed194-df94-4b74-919f-9cdcfbdcf5a7-kube-api-access-hxkdc\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.512733 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.950878 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq"] Mar 18 09:45:01 crc kubenswrapper[4778]: I0318 09:45:01.183328 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" event={"ID":"956ed194-df94-4b74-919f-9cdcfbdcf5a7","Type":"ContainerStarted","Data":"b6a6fd51a98d9937da03ae4682cc5b4ae715e8495f9ae8fc3459feb811d9d2fc"} Mar 18 09:45:01 crc kubenswrapper[4778]: I0318 09:45:01.183658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" event={"ID":"956ed194-df94-4b74-919f-9cdcfbdcf5a7","Type":"ContainerStarted","Data":"50d5642833b7d2fab625e9ecbe8af9feca4613d9050cdea2f10325e0597cb421"} Mar 18 09:45:01 crc kubenswrapper[4778]: I0318 09:45:01.205375 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" 
podStartSLOduration=1.205356017 podStartE2EDuration="1.205356017s" podCreationTimestamp="2026-03-18 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:45:01.19775446 +0000 UTC m=+2567.772499310" watchObservedRunningTime="2026-03-18 09:45:01.205356017 +0000 UTC m=+2567.780100857" Mar 18 09:45:02 crc kubenswrapper[4778]: I0318 09:45:02.194395 4778 generic.go:334] "Generic (PLEG): container finished" podID="956ed194-df94-4b74-919f-9cdcfbdcf5a7" containerID="b6a6fd51a98d9937da03ae4682cc5b4ae715e8495f9ae8fc3459feb811d9d2fc" exitCode=0 Mar 18 09:45:02 crc kubenswrapper[4778]: I0318 09:45:02.197394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" event={"ID":"956ed194-df94-4b74-919f-9cdcfbdcf5a7","Type":"ContainerDied","Data":"b6a6fd51a98d9937da03ae4682cc5b4ae715e8495f9ae8fc3459feb811d9d2fc"} Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.586404 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.638994 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956ed194-df94-4b74-919f-9cdcfbdcf5a7-config-volume\") pod \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.639240 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxkdc\" (UniqueName: \"kubernetes.io/projected/956ed194-df94-4b74-919f-9cdcfbdcf5a7-kube-api-access-hxkdc\") pod \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.639316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956ed194-df94-4b74-919f-9cdcfbdcf5a7-secret-volume\") pod \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.640346 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/956ed194-df94-4b74-919f-9cdcfbdcf5a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "956ed194-df94-4b74-919f-9cdcfbdcf5a7" (UID: "956ed194-df94-4b74-919f-9cdcfbdcf5a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.646756 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956ed194-df94-4b74-919f-9cdcfbdcf5a7-kube-api-access-hxkdc" (OuterVolumeSpecName: "kube-api-access-hxkdc") pod "956ed194-df94-4b74-919f-9cdcfbdcf5a7" (UID: "956ed194-df94-4b74-919f-9cdcfbdcf5a7"). 
InnerVolumeSpecName "kube-api-access-hxkdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.648767 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/956ed194-df94-4b74-919f-9cdcfbdcf5a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "956ed194-df94-4b74-919f-9cdcfbdcf5a7" (UID: "956ed194-df94-4b74-919f-9cdcfbdcf5a7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.741738 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxkdc\" (UniqueName: \"kubernetes.io/projected/956ed194-df94-4b74-919f-9cdcfbdcf5a7-kube-api-access-hxkdc\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.741778 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956ed194-df94-4b74-919f-9cdcfbdcf5a7-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.741788 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956ed194-df94-4b74-919f-9cdcfbdcf5a7-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:04 crc kubenswrapper[4778]: I0318 09:45:04.193825 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:45:04 crc kubenswrapper[4778]: E0318 09:45:04.194273 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:45:04 crc kubenswrapper[4778]: I0318 09:45:04.210353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" event={"ID":"956ed194-df94-4b74-919f-9cdcfbdcf5a7","Type":"ContainerDied","Data":"50d5642833b7d2fab625e9ecbe8af9feca4613d9050cdea2f10325e0597cb421"} Mar 18 09:45:04 crc kubenswrapper[4778]: I0318 09:45:04.210520 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d5642833b7d2fab625e9ecbe8af9feca4613d9050cdea2f10325e0597cb421" Mar 18 09:45:04 crc kubenswrapper[4778]: I0318 09:45:04.210390 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:04 crc kubenswrapper[4778]: I0318 09:45:04.268242 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"] Mar 18 09:45:04 crc kubenswrapper[4778]: I0318 09:45:04.283266 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"] Mar 18 09:45:06 crc kubenswrapper[4778]: I0318 09:45:06.197615 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ee6937-a1a5-42ea-a460-29d54478e633" path="/var/lib/kubelet/pods/97ee6937-a1a5-42ea-a460-29d54478e633/volumes" Mar 18 09:45:15 crc kubenswrapper[4778]: I0318 09:45:15.189076 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:45:15 crc kubenswrapper[4778]: E0318 09:45:15.190380 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:45:16 crc kubenswrapper[4778]: I0318 09:45:16.338402 4778 generic.go:334] "Generic (PLEG): container finished" podID="1f0f4177-ad12-4848-bbd7-39b004344cb3" containerID="c96da0fdc9f23d1c8174300e8944755e5546994203de0c9b38e19a45beb705b3" exitCode=0 Mar 18 09:45:16 crc kubenswrapper[4778]: I0318 09:45:16.338723 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" event={"ID":"1f0f4177-ad12-4848-bbd7-39b004344cb3","Type":"ContainerDied","Data":"c96da0fdc9f23d1c8174300e8944755e5546994203de0c9b38e19a45beb705b3"} Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.747441 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.824263 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovn-combined-ca-bundle\") pod \"1f0f4177-ad12-4848-bbd7-39b004344cb3\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.824316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ssh-key-openstack-edpm-ipam\") pod \"1f0f4177-ad12-4848-bbd7-39b004344cb3\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.824337 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8df\" (UniqueName: 
\"kubernetes.io/projected/1f0f4177-ad12-4848-bbd7-39b004344cb3-kube-api-access-fc8df\") pod \"1f0f4177-ad12-4848-bbd7-39b004344cb3\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.824401 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-inventory\") pod \"1f0f4177-ad12-4848-bbd7-39b004344cb3\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.824447 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovncontroller-config-0\") pod \"1f0f4177-ad12-4848-bbd7-39b004344cb3\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.824466 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ceph\") pod \"1f0f4177-ad12-4848-bbd7-39b004344cb3\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.832493 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1f0f4177-ad12-4848-bbd7-39b004344cb3" (UID: "1f0f4177-ad12-4848-bbd7-39b004344cb3"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.832595 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0f4177-ad12-4848-bbd7-39b004344cb3-kube-api-access-fc8df" (OuterVolumeSpecName: "kube-api-access-fc8df") pod "1f0f4177-ad12-4848-bbd7-39b004344cb3" (UID: "1f0f4177-ad12-4848-bbd7-39b004344cb3"). InnerVolumeSpecName "kube-api-access-fc8df". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.832729 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ceph" (OuterVolumeSpecName: "ceph") pod "1f0f4177-ad12-4848-bbd7-39b004344cb3" (UID: "1f0f4177-ad12-4848-bbd7-39b004344cb3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.854141 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "1f0f4177-ad12-4848-bbd7-39b004344cb3" (UID: "1f0f4177-ad12-4848-bbd7-39b004344cb3"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.854291 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1f0f4177-ad12-4848-bbd7-39b004344cb3" (UID: "1f0f4177-ad12-4848-bbd7-39b004344cb3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.863416 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-inventory" (OuterVolumeSpecName: "inventory") pod "1f0f4177-ad12-4848-bbd7-39b004344cb3" (UID: "1f0f4177-ad12-4848-bbd7-39b004344cb3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.926613 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.926663 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.926675 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8df\" (UniqueName: \"kubernetes.io/projected/1f0f4177-ad12-4848-bbd7-39b004344cb3-kube-api-access-fc8df\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.926683 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.926692 4778 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.926703 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.359516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" event={"ID":"1f0f4177-ad12-4848-bbd7-39b004344cb3","Type":"ContainerDied","Data":"54b76851245c233c2784f282b1ca1eb9cfa025c851c32932417d057083ffca1c"} Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.359861 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54b76851245c233c2784f282b1ca1eb9cfa025c851c32932417d057083ffca1c" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.359657 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.527160 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v"] Mar 18 09:45:18 crc kubenswrapper[4778]: E0318 09:45:18.527531 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0f4177-ad12-4848-bbd7-39b004344cb3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.527548 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0f4177-ad12-4848-bbd7-39b004344cb3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 09:45:18 crc kubenswrapper[4778]: E0318 09:45:18.527584 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956ed194-df94-4b74-919f-9cdcfbdcf5a7" containerName="collect-profiles" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.527590 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="956ed194-df94-4b74-919f-9cdcfbdcf5a7" containerName="collect-profiles" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.527740 4778 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="956ed194-df94-4b74-919f-9cdcfbdcf5a7" containerName="collect-profiles" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.527762 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0f4177-ad12-4848-bbd7-39b004344cb3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.528434 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.533448 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.533695 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.534347 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.534556 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.534714 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.534859 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.535019 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.538249 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v"] Mar 18 
09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.639797 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.639921 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.640000 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.640028 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc 
kubenswrapper[4778]: I0318 09:45:18.640167 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.640245 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7p82\" (UniqueName: \"kubernetes.io/projected/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-kube-api-access-n7p82\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.640304 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742383 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742444 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742482 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742505 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742544 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742578 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7p82\" (UniqueName: 
\"kubernetes.io/projected/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-kube-api-access-n7p82\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742625 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.747756 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.748398 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.748405 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: 
\"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.748918 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.749315 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.750331 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.758342 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7p82\" (UniqueName: \"kubernetes.io/projected/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-kube-api-access-n7p82\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.845421 
4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:19 crc kubenswrapper[4778]: I0318 09:45:19.428280 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v"] Mar 18 09:45:20 crc kubenswrapper[4778]: I0318 09:45:20.381424 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" event={"ID":"52250b90-fbc6-418e-9a5f-4873d5fa5cd0","Type":"ContainerStarted","Data":"ac5f02de690c8a4d5294091531625f1900cc40e366d4cb6654150b4c7eb35d5d"} Mar 18 09:45:20 crc kubenswrapper[4778]: I0318 09:45:20.382171 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" event={"ID":"52250b90-fbc6-418e-9a5f-4873d5fa5cd0","Type":"ContainerStarted","Data":"4209540ec7b376466182558df0c5b4d7f8fd041732dba8b257e8fb43f4388585"} Mar 18 09:45:20 crc kubenswrapper[4778]: I0318 09:45:20.403643 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" podStartSLOduration=1.85240798 podStartE2EDuration="2.403610971s" podCreationTimestamp="2026-03-18 09:45:18 +0000 UTC" firstStartedPulling="2026-03-18 09:45:19.4258564 +0000 UTC m=+2586.000601250" lastFinishedPulling="2026-03-18 09:45:19.977059391 +0000 UTC m=+2586.551804241" observedRunningTime="2026-03-18 09:45:20.398902343 +0000 UTC m=+2586.973647243" watchObservedRunningTime="2026-03-18 09:45:20.403610971 +0000 UTC m=+2586.978355821" Mar 18 09:45:30 crc kubenswrapper[4778]: I0318 09:45:30.187370 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:45:30 crc kubenswrapper[4778]: E0318 09:45:30.188491 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:45:32 crc kubenswrapper[4778]: I0318 09:45:32.445543 4778 scope.go:117] "RemoveContainer" containerID="f1aaa8a2c1f96baaee4b7353f353a9b567635ea9eb73df19ffa50153f00a757d" Mar 18 09:45:41 crc kubenswrapper[4778]: I0318 09:45:41.187706 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:45:41 crc kubenswrapper[4778]: E0318 09:45:41.188469 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:45:54 crc kubenswrapper[4778]: I0318 09:45:54.192538 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:45:54 crc kubenswrapper[4778]: E0318 09:45:54.193294 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.162798 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563786-wpjmv"] Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.164534 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.168054 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.168389 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.168467 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.176644 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563786-wpjmv"] Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.304839 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v7zf\" (UniqueName: \"kubernetes.io/projected/e81f72c3-90fb-4526-97e3-977f3dbd00b0-kube-api-access-7v7zf\") pod \"auto-csr-approver-29563786-wpjmv\" (UID: \"e81f72c3-90fb-4526-97e3-977f3dbd00b0\") " pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.407948 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v7zf\" (UniqueName: \"kubernetes.io/projected/e81f72c3-90fb-4526-97e3-977f3dbd00b0-kube-api-access-7v7zf\") pod \"auto-csr-approver-29563786-wpjmv\" (UID: \"e81f72c3-90fb-4526-97e3-977f3dbd00b0\") " pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.437061 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v7zf\" (UniqueName: 
\"kubernetes.io/projected/e81f72c3-90fb-4526-97e3-977f3dbd00b0-kube-api-access-7v7zf\") pod \"auto-csr-approver-29563786-wpjmv\" (UID: \"e81f72c3-90fb-4526-97e3-977f3dbd00b0\") " pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.497326 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:01 crc kubenswrapper[4778]: I0318 09:46:01.033655 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563786-wpjmv"] Mar 18 09:46:01 crc kubenswrapper[4778]: I0318 09:46:01.772405 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" event={"ID":"e81f72c3-90fb-4526-97e3-977f3dbd00b0","Type":"ContainerStarted","Data":"ed3b9b80a3eaff39754a6a0e4347277ee13e77779ab27abb2a449815d81e2808"} Mar 18 09:46:02 crc kubenswrapper[4778]: I0318 09:46:02.783714 4778 generic.go:334] "Generic (PLEG): container finished" podID="e81f72c3-90fb-4526-97e3-977f3dbd00b0" containerID="f0336ddfd7a0dbb015d37a5f5151d0bd63e8c2d9a92eb6c0cfc48a0cb9420252" exitCode=0 Mar 18 09:46:02 crc kubenswrapper[4778]: I0318 09:46:02.783774 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" event={"ID":"e81f72c3-90fb-4526-97e3-977f3dbd00b0","Type":"ContainerDied","Data":"f0336ddfd7a0dbb015d37a5f5151d0bd63e8c2d9a92eb6c0cfc48a0cb9420252"} Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.092929 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.278895 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v7zf\" (UniqueName: \"kubernetes.io/projected/e81f72c3-90fb-4526-97e3-977f3dbd00b0-kube-api-access-7v7zf\") pod \"e81f72c3-90fb-4526-97e3-977f3dbd00b0\" (UID: \"e81f72c3-90fb-4526-97e3-977f3dbd00b0\") " Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.287062 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81f72c3-90fb-4526-97e3-977f3dbd00b0-kube-api-access-7v7zf" (OuterVolumeSpecName: "kube-api-access-7v7zf") pod "e81f72c3-90fb-4526-97e3-977f3dbd00b0" (UID: "e81f72c3-90fb-4526-97e3-977f3dbd00b0"). InnerVolumeSpecName "kube-api-access-7v7zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.382212 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v7zf\" (UniqueName: \"kubernetes.io/projected/e81f72c3-90fb-4526-97e3-977f3dbd00b0-kube-api-access-7v7zf\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.803604 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" event={"ID":"e81f72c3-90fb-4526-97e3-977f3dbd00b0","Type":"ContainerDied","Data":"ed3b9b80a3eaff39754a6a0e4347277ee13e77779ab27abb2a449815d81e2808"} Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.804055 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed3b9b80a3eaff39754a6a0e4347277ee13e77779ab27abb2a449815d81e2808" Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.804140 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:05 crc kubenswrapper[4778]: I0318 09:46:05.171740 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563780-vgggq"] Mar 18 09:46:05 crc kubenswrapper[4778]: I0318 09:46:05.181094 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563780-vgggq"] Mar 18 09:46:06 crc kubenswrapper[4778]: I0318 09:46:06.198130 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed393452-0d17-4c60-b37b-544b21c09da1" path="/var/lib/kubelet/pods/ed393452-0d17-4c60-b37b-544b21c09da1/volumes" Mar 18 09:46:08 crc kubenswrapper[4778]: I0318 09:46:08.187822 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:46:08 crc kubenswrapper[4778]: E0318 09:46:08.188259 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:46:14 crc kubenswrapper[4778]: I0318 09:46:14.893874 4778 generic.go:334] "Generic (PLEG): container finished" podID="52250b90-fbc6-418e-9a5f-4873d5fa5cd0" containerID="ac5f02de690c8a4d5294091531625f1900cc40e366d4cb6654150b4c7eb35d5d" exitCode=0 Mar 18 09:46:14 crc kubenswrapper[4778]: I0318 09:46:14.894094 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" event={"ID":"52250b90-fbc6-418e-9a5f-4873d5fa5cd0","Type":"ContainerDied","Data":"ac5f02de690c8a4d5294091531625f1900cc40e366d4cb6654150b4c7eb35d5d"} Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 
09:46:16.378459 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.520592 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ssh-key-openstack-edpm-ipam\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.520983 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-nova-metadata-neutron-config-0\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.521099 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.521141 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ceph\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.521248 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7p82\" (UniqueName: \"kubernetes.io/projected/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-kube-api-access-n7p82\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: 
\"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.521315 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-inventory\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.521391 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-metadata-combined-ca-bundle\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.526544 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.526689 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-kube-api-access-n7p82" (OuterVolumeSpecName: "kube-api-access-n7p82") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "kube-api-access-n7p82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.527991 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ceph" (OuterVolumeSpecName: "ceph") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.554457 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-inventory" (OuterVolumeSpecName: "inventory") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.556045 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.560636 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.564908 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.624559 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7p82\" (UniqueName: \"kubernetes.io/projected/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-kube-api-access-n7p82\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.624813 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.624966 4778 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.625095 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.625461 4778 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc 
kubenswrapper[4778]: I0318 09:46:16.625585 4778 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.625751 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.922055 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.922321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" event={"ID":"52250b90-fbc6-418e-9a5f-4873d5fa5cd0","Type":"ContainerDied","Data":"4209540ec7b376466182558df0c5b4d7f8fd041732dba8b257e8fb43f4388585"} Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.922359 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4209540ec7b376466182558df0c5b4d7f8fd041732dba8b257e8fb43f4388585" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.036751 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr"] Mar 18 09:46:17 crc kubenswrapper[4778]: E0318 09:46:17.037109 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81f72c3-90fb-4526-97e3-977f3dbd00b0" containerName="oc" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.037125 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81f72c3-90fb-4526-97e3-977f3dbd00b0" containerName="oc" Mar 18 09:46:17 crc kubenswrapper[4778]: E0318 09:46:17.037140 4778 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="52250b90-fbc6-418e-9a5f-4873d5fa5cd0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.037147 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52250b90-fbc6-418e-9a5f-4873d5fa5cd0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.037311 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81f72c3-90fb-4526-97e3-977f3dbd00b0" containerName="oc" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.037338 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="52250b90-fbc6-418e-9a5f-4873d5fa5cd0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.037860 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.040354 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.040502 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.040569 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.040766 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.040768 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.041680 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"openstack-aee-default-env" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.059276 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr"] Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.235382 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqtc4\" (UniqueName: \"kubernetes.io/projected/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-kube-api-access-nqtc4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.235444 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.235514 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.235553 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.235580 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.235602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.336996 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.337051 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.337356 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-nqtc4\" (UniqueName: \"kubernetes.io/projected/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-kube-api-access-nqtc4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.337415 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.337527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.337613 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.340797 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.341090 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.342125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.346744 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.347438 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.362544 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqtc4\" (UniqueName: 
\"kubernetes.io/projected/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-kube-api-access-nqtc4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.659840 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:18 crc kubenswrapper[4778]: I0318 09:46:18.214056 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr"] Mar 18 09:46:18 crc kubenswrapper[4778]: I0318 09:46:18.940090 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" event={"ID":"d50b5540-c2ca-4889-bbb0-3b5d04bc602f","Type":"ContainerStarted","Data":"2ab6059febfad7a9f9307837f23ed27bc599c3e1e1aefffb4f2d067fd81fc840"} Mar 18 09:46:19 crc kubenswrapper[4778]: I0318 09:46:19.948560 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" event={"ID":"d50b5540-c2ca-4889-bbb0-3b5d04bc602f","Type":"ContainerStarted","Data":"3d07267fc8bce82aa6c1c143fb1b7b931cfc127a5bd399a0e485c50a9cb33804"} Mar 18 09:46:19 crc kubenswrapper[4778]: I0318 09:46:19.965456 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" podStartSLOduration=2.3635334820000002 podStartE2EDuration="2.965435961s" podCreationTimestamp="2026-03-18 09:46:17 +0000 UTC" firstStartedPulling="2026-03-18 09:46:18.216839225 +0000 UTC m=+2644.791584065" lastFinishedPulling="2026-03-18 09:46:18.818741674 +0000 UTC m=+2645.393486544" observedRunningTime="2026-03-18 09:46:19.963652862 +0000 UTC m=+2646.538397722" watchObservedRunningTime="2026-03-18 09:46:19.965435961 +0000 UTC m=+2646.540180811" Mar 18 09:46:20 crc 
kubenswrapper[4778]: I0318 09:46:20.187633 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:46:20 crc kubenswrapper[4778]: E0318 09:46:20.187901 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:46:32 crc kubenswrapper[4778]: I0318 09:46:32.559869 4778 scope.go:117] "RemoveContainer" containerID="9f45f4032f3621f6cd43ea95d13369122ace0eb37b6189c6643a14332da3a74a" Mar 18 09:46:35 crc kubenswrapper[4778]: I0318 09:46:35.188322 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:46:36 crc kubenswrapper[4778]: I0318 09:46:36.140897 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"5df8914ca5e61db32f1727722fb361804c23982bfeee84574751dabf5cd01a2d"} Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.149064 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563788-pctk8"] Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.150806 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.153058 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.153478 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.153769 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.168355 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563788-pctk8"] Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.196418 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xglt8\" (UniqueName: \"kubernetes.io/projected/dc64d6e3-ed19-4365-ab83-8c1af026054b-kube-api-access-xglt8\") pod \"auto-csr-approver-29563788-pctk8\" (UID: \"dc64d6e3-ed19-4365-ab83-8c1af026054b\") " pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.299325 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xglt8\" (UniqueName: \"kubernetes.io/projected/dc64d6e3-ed19-4365-ab83-8c1af026054b-kube-api-access-xglt8\") pod \"auto-csr-approver-29563788-pctk8\" (UID: \"dc64d6e3-ed19-4365-ab83-8c1af026054b\") " pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.324212 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xglt8\" (UniqueName: \"kubernetes.io/projected/dc64d6e3-ed19-4365-ab83-8c1af026054b-kube-api-access-xglt8\") pod \"auto-csr-approver-29563788-pctk8\" (UID: \"dc64d6e3-ed19-4365-ab83-8c1af026054b\") " 
pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.470772 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.913656 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563788-pctk8"] Mar 18 09:48:01 crc kubenswrapper[4778]: I0318 09:48:01.033616 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563788-pctk8" event={"ID":"dc64d6e3-ed19-4365-ab83-8c1af026054b","Type":"ContainerStarted","Data":"e7b1e361c7d1374a9d9a7e89faa1a51a3b4b938d4325f0bf8d9fa2f63d5656a6"} Mar 18 09:48:03 crc kubenswrapper[4778]: I0318 09:48:03.061744 4778 generic.go:334] "Generic (PLEG): container finished" podID="dc64d6e3-ed19-4365-ab83-8c1af026054b" containerID="12a11cfc29f7b65306c1684f9c90110c5f5f19bee2195c78cf1dbf6c7f4120dd" exitCode=0 Mar 18 09:48:03 crc kubenswrapper[4778]: I0318 09:48:03.061855 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563788-pctk8" event={"ID":"dc64d6e3-ed19-4365-ab83-8c1af026054b","Type":"ContainerDied","Data":"12a11cfc29f7b65306c1684f9c90110c5f5f19bee2195c78cf1dbf6c7f4120dd"} Mar 18 09:48:04 crc kubenswrapper[4778]: I0318 09:48:04.395313 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:04 crc kubenswrapper[4778]: I0318 09:48:04.491733 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xglt8\" (UniqueName: \"kubernetes.io/projected/dc64d6e3-ed19-4365-ab83-8c1af026054b-kube-api-access-xglt8\") pod \"dc64d6e3-ed19-4365-ab83-8c1af026054b\" (UID: \"dc64d6e3-ed19-4365-ab83-8c1af026054b\") " Mar 18 09:48:04 crc kubenswrapper[4778]: I0318 09:48:04.501345 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc64d6e3-ed19-4365-ab83-8c1af026054b-kube-api-access-xglt8" (OuterVolumeSpecName: "kube-api-access-xglt8") pod "dc64d6e3-ed19-4365-ab83-8c1af026054b" (UID: "dc64d6e3-ed19-4365-ab83-8c1af026054b"). InnerVolumeSpecName "kube-api-access-xglt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:48:04 crc kubenswrapper[4778]: I0318 09:48:04.595782 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xglt8\" (UniqueName: \"kubernetes.io/projected/dc64d6e3-ed19-4365-ab83-8c1af026054b-kube-api-access-xglt8\") on node \"crc\" DevicePath \"\"" Mar 18 09:48:05 crc kubenswrapper[4778]: I0318 09:48:05.086638 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563788-pctk8" event={"ID":"dc64d6e3-ed19-4365-ab83-8c1af026054b","Type":"ContainerDied","Data":"e7b1e361c7d1374a9d9a7e89faa1a51a3b4b938d4325f0bf8d9fa2f63d5656a6"} Mar 18 09:48:05 crc kubenswrapper[4778]: I0318 09:48:05.086705 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7b1e361c7d1374a9d9a7e89faa1a51a3b4b938d4325f0bf8d9fa2f63d5656a6" Mar 18 09:48:05 crc kubenswrapper[4778]: I0318 09:48:05.086787 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:05 crc kubenswrapper[4778]: I0318 09:48:05.486722 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563782-zn5kg"] Mar 18 09:48:05 crc kubenswrapper[4778]: I0318 09:48:05.493884 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563782-zn5kg"] Mar 18 09:48:06 crc kubenswrapper[4778]: I0318 09:48:06.209720 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3895116-2d67-4e3c-9f3e-e04d3cfe0518" path="/var/lib/kubelet/pods/d3895116-2d67-4e3c-9f3e-e04d3cfe0518/volumes" Mar 18 09:48:32 crc kubenswrapper[4778]: I0318 09:48:32.646717 4778 scope.go:117] "RemoveContainer" containerID="c1ff920321931ae33985b21e2caf3b4db031e28a61271d5d1f2c36e681d955e6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.089564 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jztl6"] Mar 18 09:48:55 crc kubenswrapper[4778]: E0318 09:48:55.091395 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc64d6e3-ed19-4365-ab83-8c1af026054b" containerName="oc" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.091434 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc64d6e3-ed19-4365-ab83-8c1af026054b" containerName="oc" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.091970 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc64d6e3-ed19-4365-ab83-8c1af026054b" containerName="oc" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.095111 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.117324 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jztl6"] Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.199274 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65mrl\" (UniqueName: \"kubernetes.io/projected/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-kube-api-access-65mrl\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.199318 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-utilities\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.199446 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-catalog-content\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.301159 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-catalog-content\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.301362 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-65mrl\" (UniqueName: \"kubernetes.io/projected/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-kube-api-access-65mrl\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.301408 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-utilities\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.302128 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-utilities\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.302314 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-catalog-content\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.321181 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65mrl\" (UniqueName: \"kubernetes.io/projected/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-kube-api-access-65mrl\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.420177 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.973311 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jztl6"] Mar 18 09:48:56 crc kubenswrapper[4778]: I0318 09:48:56.576602 4778 generic.go:334] "Generic (PLEG): container finished" podID="0c0dfa2e-b334-4eed-9e2f-3097f2b5102a" containerID="406d268147ccbd6200b43ebee4807fc73359494cf62045cc84dfad45af9131fc" exitCode=0 Mar 18 09:48:56 crc kubenswrapper[4778]: I0318 09:48:56.576692 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jztl6" event={"ID":"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a","Type":"ContainerDied","Data":"406d268147ccbd6200b43ebee4807fc73359494cf62045cc84dfad45af9131fc"} Mar 18 09:48:56 crc kubenswrapper[4778]: I0318 09:48:56.578297 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jztl6" event={"ID":"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a","Type":"ContainerStarted","Data":"635a623f677257671dc346299ad868d6d659f6256d11fb27418a60853c67216b"} Mar 18 09:48:56 crc kubenswrapper[4778]: I0318 09:48:56.579917 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:49:00 crc kubenswrapper[4778]: I0318 09:49:00.147284 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:49:00 crc kubenswrapper[4778]: I0318 09:49:00.147913 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:49:00 crc kubenswrapper[4778]: I0318 09:49:00.625530 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jztl6" event={"ID":"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a","Type":"ContainerStarted","Data":"f540e4e75bb9d7a85bc9cc86ce188873a4bcab0bee0b8f301e74cc3af623bb8b"} Mar 18 09:49:01 crc kubenswrapper[4778]: I0318 09:49:01.637923 4778 generic.go:334] "Generic (PLEG): container finished" podID="0c0dfa2e-b334-4eed-9e2f-3097f2b5102a" containerID="f540e4e75bb9d7a85bc9cc86ce188873a4bcab0bee0b8f301e74cc3af623bb8b" exitCode=0 Mar 18 09:49:01 crc kubenswrapper[4778]: I0318 09:49:01.637992 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jztl6" event={"ID":"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a","Type":"ContainerDied","Data":"f540e4e75bb9d7a85bc9cc86ce188873a4bcab0bee0b8f301e74cc3af623bb8b"} Mar 18 09:49:02 crc kubenswrapper[4778]: I0318 09:49:02.650189 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jztl6" event={"ID":"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a","Type":"ContainerStarted","Data":"3fa2561048c07e212e89fb360838163147894a07830056afc658fa4ddede2620"} Mar 18 09:49:02 crc kubenswrapper[4778]: I0318 09:49:02.669181 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jztl6" podStartSLOduration=2.111380907 podStartE2EDuration="7.669161489s" podCreationTimestamp="2026-03-18 09:48:55 +0000 UTC" firstStartedPulling="2026-03-18 09:48:56.579542559 +0000 UTC m=+2803.154287429" lastFinishedPulling="2026-03-18 09:49:02.137323171 +0000 UTC m=+2808.712068011" observedRunningTime="2026-03-18 09:49:02.667916625 +0000 UTC m=+2809.242661535" watchObservedRunningTime="2026-03-18 09:49:02.669161489 +0000 UTC m=+2809.243906329" Mar 18 09:49:05 crc kubenswrapper[4778]: I0318 
09:49:05.422386 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:49:05 crc kubenswrapper[4778]: I0318 09:49:05.422838 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:49:05 crc kubenswrapper[4778]: I0318 09:49:05.465998 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:49:15 crc kubenswrapper[4778]: I0318 09:49:15.469448 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:49:15 crc kubenswrapper[4778]: I0318 09:49:15.547777 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jztl6"] Mar 18 09:49:15 crc kubenswrapper[4778]: I0318 09:49:15.619806 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ktcxn"] Mar 18 09:49:15 crc kubenswrapper[4778]: I0318 09:49:15.620117 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ktcxn" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="registry-server" containerID="cri-o://0d5497da92a3a6b067e66da6b34d1f0c05c8fc0ce853a92ab83f4966e6f9359e" gracePeriod=2 Mar 18 09:49:15 crc kubenswrapper[4778]: I0318 09:49:15.781658 4778 generic.go:334] "Generic (PLEG): container finished" podID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerID="0d5497da92a3a6b067e66da6b34d1f0c05c8fc0ce853a92ab83f4966e6f9359e" exitCode=0 Mar 18 09:49:15 crc kubenswrapper[4778]: I0318 09:49:15.781751 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktcxn" 
event={"ID":"fee87709-f8ed-4eb4-829e-1fdb6534bb35","Type":"ContainerDied","Data":"0d5497da92a3a6b067e66da6b34d1f0c05c8fc0ce853a92ab83f4966e6f9359e"} Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.101422 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.191862 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-utilities\") pod \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.192012 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s9q2\" (UniqueName: \"kubernetes.io/projected/fee87709-f8ed-4eb4-829e-1fdb6534bb35-kube-api-access-5s9q2\") pod \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.192069 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-catalog-content\") pod \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.192541 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-utilities" (OuterVolumeSpecName: "utilities") pod "fee87709-f8ed-4eb4-829e-1fdb6534bb35" (UID: "fee87709-f8ed-4eb4-829e-1fdb6534bb35"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.199372 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee87709-f8ed-4eb4-829e-1fdb6534bb35-kube-api-access-5s9q2" (OuterVolumeSpecName: "kube-api-access-5s9q2") pod "fee87709-f8ed-4eb4-829e-1fdb6534bb35" (UID: "fee87709-f8ed-4eb4-829e-1fdb6534bb35"). InnerVolumeSpecName "kube-api-access-5s9q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.245854 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fee87709-f8ed-4eb4-829e-1fdb6534bb35" (UID: "fee87709-f8ed-4eb4-829e-1fdb6534bb35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.294510 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s9q2\" (UniqueName: \"kubernetes.io/projected/fee87709-f8ed-4eb4-829e-1fdb6534bb35-kube-api-access-5s9q2\") on node \"crc\" DevicePath \"\"" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.294539 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.294924 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.791208 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktcxn" 
event={"ID":"fee87709-f8ed-4eb4-829e-1fdb6534bb35","Type":"ContainerDied","Data":"1a86626e5d4576d55c9f62a59074b0761782b73c69b053e7829507e68652f471"} Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.791267 4778 scope.go:117] "RemoveContainer" containerID="0d5497da92a3a6b067e66da6b34d1f0c05c8fc0ce853a92ab83f4966e6f9359e" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.791359 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.821768 4778 scope.go:117] "RemoveContainer" containerID="d37142aca8df005734457524dffa32c4483716edffbcfb2d1b92b3701d6e7e1c" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.825313 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ktcxn"] Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.833475 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ktcxn"] Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.855327 4778 scope.go:117] "RemoveContainer" containerID="6ecbe80389c09da7c5dfaf24f572df1adb64cba289f74a3e8339845f8cebe749" Mar 18 09:49:18 crc kubenswrapper[4778]: I0318 09:49:18.195968 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" path="/var/lib/kubelet/pods/fee87709-f8ed-4eb4-829e-1fdb6534bb35/volumes" Mar 18 09:49:30 crc kubenswrapper[4778]: I0318 09:49:30.147294 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:49:30 crc kubenswrapper[4778]: I0318 09:49:30.147911 4778 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.148306 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.149153 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.149548 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.150409 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5df8914ca5e61db32f1727722fb361804c23982bfeee84574751dabf5cd01a2d"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.150486 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" 
containerID="cri-o://5df8914ca5e61db32f1727722fb361804c23982bfeee84574751dabf5cd01a2d" gracePeriod=600 Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.159816 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563790-nsf4n"] Mar 18 09:50:00 crc kubenswrapper[4778]: E0318 09:50:00.160385 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="extract-utilities" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.160454 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="extract-utilities" Mar 18 09:50:00 crc kubenswrapper[4778]: E0318 09:50:00.160492 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="extract-content" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.160504 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="extract-content" Mar 18 09:50:00 crc kubenswrapper[4778]: E0318 09:50:00.160544 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="registry-server" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.160556 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="registry-server" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.160847 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="registry-server" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.161687 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.164024 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.164402 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.164548 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.178616 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563790-nsf4n"] Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.341383 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp68l\" (UniqueName: \"kubernetes.io/projected/08fbf495-18e2-4d61-ad96-1bf74db07f0e-kube-api-access-wp68l\") pod \"auto-csr-approver-29563790-nsf4n\" (UID: \"08fbf495-18e2-4d61-ad96-1bf74db07f0e\") " pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.444396 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp68l\" (UniqueName: \"kubernetes.io/projected/08fbf495-18e2-4d61-ad96-1bf74db07f0e-kube-api-access-wp68l\") pod \"auto-csr-approver-29563790-nsf4n\" (UID: \"08fbf495-18e2-4d61-ad96-1bf74db07f0e\") " pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.468834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp68l\" (UniqueName: \"kubernetes.io/projected/08fbf495-18e2-4d61-ad96-1bf74db07f0e-kube-api-access-wp68l\") pod \"auto-csr-approver-29563790-nsf4n\" (UID: \"08fbf495-18e2-4d61-ad96-1bf74db07f0e\") " 
pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.518853 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.982175 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563790-nsf4n"] Mar 18 09:50:00 crc kubenswrapper[4778]: W0318 09:50:00.984782 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08fbf495_18e2_4d61_ad96_1bf74db07f0e.slice/crio-fdd2c59cae4643aef1ce354ee05143c0cd4df887daa07d1b649bf9758d75f050 WatchSource:0}: Error finding container fdd2c59cae4643aef1ce354ee05143c0cd4df887daa07d1b649bf9758d75f050: Status 404 returned error can't find the container with id fdd2c59cae4643aef1ce354ee05143c0cd4df887daa07d1b649bf9758d75f050 Mar 18 09:50:01 crc kubenswrapper[4778]: I0318 09:50:01.200217 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" event={"ID":"08fbf495-18e2-4d61-ad96-1bf74db07f0e","Type":"ContainerStarted","Data":"fdd2c59cae4643aef1ce354ee05143c0cd4df887daa07d1b649bf9758d75f050"} Mar 18 09:50:01 crc kubenswrapper[4778]: I0318 09:50:01.203376 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="5df8914ca5e61db32f1727722fb361804c23982bfeee84574751dabf5cd01a2d" exitCode=0 Mar 18 09:50:01 crc kubenswrapper[4778]: I0318 09:50:01.203392 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"5df8914ca5e61db32f1727722fb361804c23982bfeee84574751dabf5cd01a2d"} Mar 18 09:50:01 crc kubenswrapper[4778]: I0318 09:50:01.203421 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3"} Mar 18 09:50:01 crc kubenswrapper[4778]: I0318 09:50:01.203439 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:50:03 crc kubenswrapper[4778]: I0318 09:50:03.239616 4778 generic.go:334] "Generic (PLEG): container finished" podID="08fbf495-18e2-4d61-ad96-1bf74db07f0e" containerID="58ef47c1a33dc103d35c1381547dc4531f738d5df6648d3b82a9b2e034b9599e" exitCode=0 Mar 18 09:50:03 crc kubenswrapper[4778]: I0318 09:50:03.239718 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" event={"ID":"08fbf495-18e2-4d61-ad96-1bf74db07f0e","Type":"ContainerDied","Data":"58ef47c1a33dc103d35c1381547dc4531f738d5df6648d3b82a9b2e034b9599e"} Mar 18 09:50:04 crc kubenswrapper[4778]: I0318 09:50:04.633371 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:04 crc kubenswrapper[4778]: I0318 09:50:04.827977 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp68l\" (UniqueName: \"kubernetes.io/projected/08fbf495-18e2-4d61-ad96-1bf74db07f0e-kube-api-access-wp68l\") pod \"08fbf495-18e2-4d61-ad96-1bf74db07f0e\" (UID: \"08fbf495-18e2-4d61-ad96-1bf74db07f0e\") " Mar 18 09:50:04 crc kubenswrapper[4778]: I0318 09:50:04.833797 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08fbf495-18e2-4d61-ad96-1bf74db07f0e-kube-api-access-wp68l" (OuterVolumeSpecName: "kube-api-access-wp68l") pod "08fbf495-18e2-4d61-ad96-1bf74db07f0e" (UID: "08fbf495-18e2-4d61-ad96-1bf74db07f0e"). InnerVolumeSpecName "kube-api-access-wp68l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:50:04 crc kubenswrapper[4778]: I0318 09:50:04.931498 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp68l\" (UniqueName: \"kubernetes.io/projected/08fbf495-18e2-4d61-ad96-1bf74db07f0e-kube-api-access-wp68l\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:05 crc kubenswrapper[4778]: I0318 09:50:05.257304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" event={"ID":"08fbf495-18e2-4d61-ad96-1bf74db07f0e","Type":"ContainerDied","Data":"fdd2c59cae4643aef1ce354ee05143c0cd4df887daa07d1b649bf9758d75f050"} Mar 18 09:50:05 crc kubenswrapper[4778]: I0318 09:50:05.257820 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd2c59cae4643aef1ce354ee05143c0cd4df887daa07d1b649bf9758d75f050" Mar 18 09:50:05 crc kubenswrapper[4778]: I0318 09:50:05.257355 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:05 crc kubenswrapper[4778]: I0318 09:50:05.710696 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563784-pdds9"] Mar 18 09:50:05 crc kubenswrapper[4778]: I0318 09:50:05.717500 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563784-pdds9"] Mar 18 09:50:06 crc kubenswrapper[4778]: I0318 09:50:06.196873 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab4f60ce-be48-4052-9fa7-905b70e65c3a" path="/var/lib/kubelet/pods/ab4f60ce-be48-4052-9fa7-905b70e65c3a/volumes" Mar 18 09:50:20 crc kubenswrapper[4778]: I0318 09:50:20.375474 4778 generic.go:334] "Generic (PLEG): container finished" podID="d50b5540-c2ca-4889-bbb0-3b5d04bc602f" containerID="3d07267fc8bce82aa6c1c143fb1b7b931cfc127a5bd399a0e485c50a9cb33804" exitCode=0 Mar 18 09:50:20 crc kubenswrapper[4778]: I0318 09:50:20.375554 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" event={"ID":"d50b5540-c2ca-4889-bbb0-3b5d04bc602f","Type":"ContainerDied","Data":"3d07267fc8bce82aa6c1c143fb1b7b931cfc127a5bd399a0e485c50a9cb33804"} Mar 18 09:50:21 crc kubenswrapper[4778]: I0318 09:50:21.925022 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.033421 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ssh-key-openstack-edpm-ipam\") pod \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.033618 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-combined-ca-bundle\") pod \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.033676 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqtc4\" (UniqueName: \"kubernetes.io/projected/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-kube-api-access-nqtc4\") pod \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.033699 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-inventory\") pod \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.033745 
4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-secret-0\") pod \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.033777 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ceph\") pod \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.039759 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d50b5540-c2ca-4889-bbb0-3b5d04bc602f" (UID: "d50b5540-c2ca-4889-bbb0-3b5d04bc602f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.040500 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ceph" (OuterVolumeSpecName: "ceph") pod "d50b5540-c2ca-4889-bbb0-3b5d04bc602f" (UID: "d50b5540-c2ca-4889-bbb0-3b5d04bc602f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.055990 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-kube-api-access-nqtc4" (OuterVolumeSpecName: "kube-api-access-nqtc4") pod "d50b5540-c2ca-4889-bbb0-3b5d04bc602f" (UID: "d50b5540-c2ca-4889-bbb0-3b5d04bc602f"). InnerVolumeSpecName "kube-api-access-nqtc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.064175 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "d50b5540-c2ca-4889-bbb0-3b5d04bc602f" (UID: "d50b5540-c2ca-4889-bbb0-3b5d04bc602f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.064467 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d50b5540-c2ca-4889-bbb0-3b5d04bc602f" (UID: "d50b5540-c2ca-4889-bbb0-3b5d04bc602f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.069784 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-inventory" (OuterVolumeSpecName: "inventory") pod "d50b5540-c2ca-4889-bbb0-3b5d04bc602f" (UID: "d50b5540-c2ca-4889-bbb0-3b5d04bc602f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.135595 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.135775 4778 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.135857 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqtc4\" (UniqueName: \"kubernetes.io/projected/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-kube-api-access-nqtc4\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.135966 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.136071 4778 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.136172 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.405388 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" 
event={"ID":"d50b5540-c2ca-4889-bbb0-3b5d04bc602f","Type":"ContainerDied","Data":"2ab6059febfad7a9f9307837f23ed27bc599c3e1e1aefffb4f2d067fd81fc840"} Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.405732 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.405765 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ab6059febfad7a9f9307837f23ed27bc599c3e1e1aefffb4f2d067fd81fc840" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.518794 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"] Mar 18 09:50:22 crc kubenswrapper[4778]: E0318 09:50:22.519235 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50b5540-c2ca-4889-bbb0-3b5d04bc602f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.519257 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50b5540-c2ca-4889-bbb0-3b5d04bc602f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 09:50:22 crc kubenswrapper[4778]: E0318 09:50:22.519293 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fbf495-18e2-4d61-ad96-1bf74db07f0e" containerName="oc" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.519302 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fbf495-18e2-4d61-ad96-1bf74db07f0e" containerName="oc" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.519503 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fbf495-18e2-4d61-ad96-1bf74db07f0e" containerName="oc" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.519530 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d50b5540-c2ca-4889-bbb0-3b5d04bc602f" 
containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.520233 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535038 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535132 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535159 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535050 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535177 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535376 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535448 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535658 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535673 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.536833 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"] Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.543482 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.543721 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6zxg\" (UniqueName: \"kubernetes.io/projected/b2db5491-57b4-427a-b306-5e525a1e7c27-kube-api-access-z6zxg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.543822 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.543909 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc 
kubenswrapper[4778]: I0318 09:50:22.543975 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544051 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544140 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544257 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 
09:50:22.544394 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544613 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544683 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544804 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544859 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.645947 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646170 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646306 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646383 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646452 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646530 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646591 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6zxg\" (UniqueName: \"kubernetes.io/projected/b2db5491-57b4-427a-b306-5e525a1e7c27-kube-api-access-z6zxg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646673 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: 
\"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646748 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646813 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646881 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646960 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc 
kubenswrapper[4778]: I0318 09:50:22.647077 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.647785 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.648229 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.650506 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.651499 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.652484 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.653012 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.653091 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.653349 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.654102 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.654691 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.656849 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.657689 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.671703 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6zxg\" (UniqueName: \"kubernetes.io/projected/b2db5491-57b4-427a-b306-5e525a1e7c27-kube-api-access-z6zxg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.835807 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:50:23 crc kubenswrapper[4778]: I0318 09:50:23.359823 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"] Mar 18 09:50:23 crc kubenswrapper[4778]: I0318 09:50:23.414358 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" event={"ID":"b2db5491-57b4-427a-b306-5e525a1e7c27","Type":"ContainerStarted","Data":"ae914e239ff54eca2bb96c1bbf0bed7d47de287f780b43c281d7c1dcccb9c71c"} Mar 18 09:50:24 crc kubenswrapper[4778]: I0318 09:50:24.422058 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" event={"ID":"b2db5491-57b4-427a-b306-5e525a1e7c27","Type":"ContainerStarted","Data":"3b076b40e8f07ea23b1427f17330f5415549c41b6c6f9192b8ef848601e5be2b"} Mar 18 09:50:24 crc kubenswrapper[4778]: I0318 09:50:24.453863 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" podStartSLOduration=1.998170894 podStartE2EDuration="2.453841033s" podCreationTimestamp="2026-03-18 09:50:22 +0000 UTC" firstStartedPulling="2026-03-18 09:50:23.365328332 +0000 UTC m=+2889.940073182" lastFinishedPulling="2026-03-18 09:50:23.820998481 +0000 UTC m=+2890.395743321" 
observedRunningTime="2026-03-18 09:50:24.444526659 +0000 UTC m=+2891.019271509" watchObservedRunningTime="2026-03-18 09:50:24.453841033 +0000 UTC m=+2891.028585913" Mar 18 09:50:32 crc kubenswrapper[4778]: I0318 09:50:32.758466 4778 scope.go:117] "RemoveContainer" containerID="8f06326ea2269b974cec86640f654b1a9c686fe2bd62106254f2a194a59e658e" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.681422 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fwdkk"] Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.683861 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.695634 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwdkk"] Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.744506 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26d47\" (UniqueName: \"kubernetes.io/projected/a91e5adc-fb5f-44af-9f4e-43c57ecece37-kube-api-access-26d47\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.744698 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-utilities\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.744743 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-catalog-content\") pod 
\"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.846787 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26d47\" (UniqueName: \"kubernetes.io/projected/a91e5adc-fb5f-44af-9f4e-43c57ecece37-kube-api-access-26d47\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.846887 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-utilities\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.846920 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-catalog-content\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.847449 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-utilities\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.847482 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-catalog-content\") pod \"redhat-marketplace-fwdkk\" (UID: 
\"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.866540 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26d47\" (UniqueName: \"kubernetes.io/projected/a91e5adc-fb5f-44af-9f4e-43c57ecece37-kube-api-access-26d47\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.885852 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4xxv6"] Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.887978 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.901092 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xxv6"] Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.948982 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgc6j\" (UniqueName: \"kubernetes.io/projected/c4b995fc-abe8-41af-9287-6381d6a3f37e-kube-api-access-xgc6j\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.950512 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-utilities\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.950723 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-catalog-content\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.011917 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.052475 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-utilities\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.052585 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-catalog-content\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.052715 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgc6j\" (UniqueName: \"kubernetes.io/projected/c4b995fc-abe8-41af-9287-6381d6a3f37e-kube-api-access-xgc6j\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.053317 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-catalog-content\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " 
pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.053323 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-utilities\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.071023 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgc6j\" (UniqueName: \"kubernetes.io/projected/c4b995fc-abe8-41af-9287-6381d6a3f37e-kube-api-access-xgc6j\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.236427 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.542870 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwdkk"] Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.740094 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xxv6"] Mar 18 09:51:47 crc kubenswrapper[4778]: I0318 09:51:47.224739 4778 generic.go:334] "Generic (PLEG): container finished" podID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerID="2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454" exitCode=0 Mar 18 09:51:47 crc kubenswrapper[4778]: I0318 09:51:47.224864 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerDied","Data":"2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454"} Mar 18 09:51:47 crc kubenswrapper[4778]: I0318 09:51:47.224928 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerStarted","Data":"89f244671c9af7d58f97b9b32aa5636eb413ce69e26ae52cac5094e2618def32"} Mar 18 09:51:47 crc kubenswrapper[4778]: I0318 09:51:47.228067 4778 generic.go:334] "Generic (PLEG): container finished" podID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerID="63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d" exitCode=0 Mar 18 09:51:47 crc kubenswrapper[4778]: I0318 09:51:47.228100 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerDied","Data":"63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d"} Mar 18 09:51:47 crc kubenswrapper[4778]: I0318 09:51:47.228122 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerStarted","Data":"c27672d46e7320acefef5001c13794ee1f57d5d043df0f1d75875dbe02c9990c"} Mar 18 09:51:48 crc kubenswrapper[4778]: I0318 09:51:48.244759 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerStarted","Data":"5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4"} Mar 18 09:51:48 crc kubenswrapper[4778]: I0318 09:51:48.246884 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerStarted","Data":"571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913"} Mar 18 09:51:49 crc kubenswrapper[4778]: I0318 09:51:49.266234 4778 generic.go:334] "Generic (PLEG): container finished" podID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" 
containerID="571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913" exitCode=0 Mar 18 09:51:49 crc kubenswrapper[4778]: I0318 09:51:49.266470 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerDied","Data":"571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913"} Mar 18 09:51:50 crc kubenswrapper[4778]: I0318 09:51:50.278307 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerStarted","Data":"4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e"} Mar 18 09:51:50 crc kubenswrapper[4778]: I0318 09:51:50.306667 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fwdkk" podStartSLOduration=2.864637789 podStartE2EDuration="5.306644439s" podCreationTimestamp="2026-03-18 09:51:45 +0000 UTC" firstStartedPulling="2026-03-18 09:51:47.230305977 +0000 UTC m=+2973.805050817" lastFinishedPulling="2026-03-18 09:51:49.672312617 +0000 UTC m=+2976.247057467" observedRunningTime="2026-03-18 09:51:50.299214618 +0000 UTC m=+2976.873959478" watchObservedRunningTime="2026-03-18 09:51:50.306644439 +0000 UTC m=+2976.881389279" Mar 18 09:51:51 crc kubenswrapper[4778]: I0318 09:51:51.289642 4778 generic.go:334] "Generic (PLEG): container finished" podID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerID="5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4" exitCode=0 Mar 18 09:51:51 crc kubenswrapper[4778]: I0318 09:51:51.289698 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerDied","Data":"5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4"} Mar 18 09:51:52 crc kubenswrapper[4778]: I0318 
09:51:52.300123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerStarted","Data":"869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7"} Mar 18 09:51:52 crc kubenswrapper[4778]: I0318 09:51:52.320509 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4xxv6" podStartSLOduration=2.680681912 podStartE2EDuration="7.320494517s" podCreationTimestamp="2026-03-18 09:51:45 +0000 UTC" firstStartedPulling="2026-03-18 09:51:47.226852524 +0000 UTC m=+2973.801597364" lastFinishedPulling="2026-03-18 09:51:51.866665109 +0000 UTC m=+2978.441409969" observedRunningTime="2026-03-18 09:51:52.317318961 +0000 UTC m=+2978.892063811" watchObservedRunningTime="2026-03-18 09:51:52.320494517 +0000 UTC m=+2978.895239357" Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.012057 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.012687 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.061533 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.237830 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.237882 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.387661 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.875143 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwdkk"] Mar 18 09:51:57 crc kubenswrapper[4778]: I0318 09:51:57.294952 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4xxv6" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="registry-server" probeResult="failure" output=< Mar 18 09:51:57 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:51:57 crc kubenswrapper[4778]: > Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.345124 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fwdkk" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="registry-server" containerID="cri-o://4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e" gracePeriod=2 Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.830042 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.910311 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-utilities\") pod \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.910429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-catalog-content\") pod \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.910480 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26d47\" (UniqueName: \"kubernetes.io/projected/a91e5adc-fb5f-44af-9f4e-43c57ecece37-kube-api-access-26d47\") pod \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.912523 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-utilities" (OuterVolumeSpecName: "utilities") pod "a91e5adc-fb5f-44af-9f4e-43c57ecece37" (UID: "a91e5adc-fb5f-44af-9f4e-43c57ecece37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.916346 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a91e5adc-fb5f-44af-9f4e-43c57ecece37-kube-api-access-26d47" (OuterVolumeSpecName: "kube-api-access-26d47") pod "a91e5adc-fb5f-44af-9f4e-43c57ecece37" (UID: "a91e5adc-fb5f-44af-9f4e-43c57ecece37"). InnerVolumeSpecName "kube-api-access-26d47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.942454 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a91e5adc-fb5f-44af-9f4e-43c57ecece37" (UID: "a91e5adc-fb5f-44af-9f4e-43c57ecece37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.012023 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.012085 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26d47\" (UniqueName: \"kubernetes.io/projected/a91e5adc-fb5f-44af-9f4e-43c57ecece37-kube-api-access-26d47\") on node \"crc\" DevicePath \"\"" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.012101 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.355562 4778 generic.go:334] "Generic (PLEG): container finished" podID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerID="4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e" exitCode=0 Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.355605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerDied","Data":"4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e"} Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.355633 4778 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerDied","Data":"c27672d46e7320acefef5001c13794ee1f57d5d043df0f1d75875dbe02c9990c"} Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.355649 4778 scope.go:117] "RemoveContainer" containerID="4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.355776 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.415898 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwdkk"] Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.425396 4778 scope.go:117] "RemoveContainer" containerID="571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.426311 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwdkk"] Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.475940 4778 scope.go:117] "RemoveContainer" containerID="63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.508826 4778 scope.go:117] "RemoveContainer" containerID="4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e" Mar 18 09:51:59 crc kubenswrapper[4778]: E0318 09:51:59.509338 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e\": container with ID starting with 4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e not found: ID does not exist" containerID="4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.509371 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e"} err="failed to get container status \"4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e\": rpc error: code = NotFound desc = could not find container \"4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e\": container with ID starting with 4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e not found: ID does not exist" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.509406 4778 scope.go:117] "RemoveContainer" containerID="571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913" Mar 18 09:51:59 crc kubenswrapper[4778]: E0318 09:51:59.509811 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913\": container with ID starting with 571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913 not found: ID does not exist" containerID="571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.509852 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913"} err="failed to get container status \"571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913\": rpc error: code = NotFound desc = could not find container \"571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913\": container with ID starting with 571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913 not found: ID does not exist" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.509868 4778 scope.go:117] "RemoveContainer" containerID="63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d" Mar 18 09:51:59 crc kubenswrapper[4778]: E0318 
09:51:59.510123 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d\": container with ID starting with 63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d not found: ID does not exist" containerID="63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.510145 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d"} err="failed to get container status \"63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d\": rpc error: code = NotFound desc = could not find container \"63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d\": container with ID starting with 63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d not found: ID does not exist" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.147113 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.147180 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.161300 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563792-4g4zq"] Mar 18 09:52:00 crc kubenswrapper[4778]: E0318 09:52:00.161686 4778 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="registry-server" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.161705 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="registry-server" Mar 18 09:52:00 crc kubenswrapper[4778]: E0318 09:52:00.161727 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="extract-content" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.161734 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="extract-content" Mar 18 09:52:00 crc kubenswrapper[4778]: E0318 09:52:00.161755 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="extract-utilities" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.161765 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="extract-utilities" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.161952 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="registry-server" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.162585 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.165478 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.165622 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.175779 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.178948 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563792-4g4zq"] Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.202579 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" path="/var/lib/kubelet/pods/a91e5adc-fb5f-44af-9f4e-43c57ecece37/volumes" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.235568 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm2lk\" (UniqueName: \"kubernetes.io/projected/a7095f92-8336-4c69-9c71-c3b9aa45bb82-kube-api-access-vm2lk\") pod \"auto-csr-approver-29563792-4g4zq\" (UID: \"a7095f92-8336-4c69-9c71-c3b9aa45bb82\") " pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.336652 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm2lk\" (UniqueName: \"kubernetes.io/projected/a7095f92-8336-4c69-9c71-c3b9aa45bb82-kube-api-access-vm2lk\") pod \"auto-csr-approver-29563792-4g4zq\" (UID: \"a7095f92-8336-4c69-9c71-c3b9aa45bb82\") " pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.354494 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vm2lk\" (UniqueName: \"kubernetes.io/projected/a7095f92-8336-4c69-9c71-c3b9aa45bb82-kube-api-access-vm2lk\") pod \"auto-csr-approver-29563792-4g4zq\" (UID: \"a7095f92-8336-4c69-9c71-c3b9aa45bb82\") " pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.479677 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.919121 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563792-4g4zq"] Mar 18 09:52:01 crc kubenswrapper[4778]: I0318 09:52:01.374317 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" event={"ID":"a7095f92-8336-4c69-9c71-c3b9aa45bb82","Type":"ContainerStarted","Data":"23f7c02e5a9c6e20be9eb80c9194cf2d4fc7da4d970b1593d3b268df4135066a"} Mar 18 09:52:03 crc kubenswrapper[4778]: I0318 09:52:03.388354 4778 generic.go:334] "Generic (PLEG): container finished" podID="a7095f92-8336-4c69-9c71-c3b9aa45bb82" containerID="129d18099eafc9ec58cca914d6f8f45f3f345a43be2618db7b6619ab09177632" exitCode=0 Mar 18 09:52:03 crc kubenswrapper[4778]: I0318 09:52:03.388803 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" event={"ID":"a7095f92-8336-4c69-9c71-c3b9aa45bb82","Type":"ContainerDied","Data":"129d18099eafc9ec58cca914d6f8f45f3f345a43be2618db7b6619ab09177632"} Mar 18 09:52:04 crc kubenswrapper[4778]: I0318 09:52:04.760014 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:04 crc kubenswrapper[4778]: I0318 09:52:04.931413 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm2lk\" (UniqueName: \"kubernetes.io/projected/a7095f92-8336-4c69-9c71-c3b9aa45bb82-kube-api-access-vm2lk\") pod \"a7095f92-8336-4c69-9c71-c3b9aa45bb82\" (UID: \"a7095f92-8336-4c69-9c71-c3b9aa45bb82\") " Mar 18 09:52:04 crc kubenswrapper[4778]: I0318 09:52:04.937681 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7095f92-8336-4c69-9c71-c3b9aa45bb82-kube-api-access-vm2lk" (OuterVolumeSpecName: "kube-api-access-vm2lk") pod "a7095f92-8336-4c69-9c71-c3b9aa45bb82" (UID: "a7095f92-8336-4c69-9c71-c3b9aa45bb82"). InnerVolumeSpecName "kube-api-access-vm2lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:52:05 crc kubenswrapper[4778]: I0318 09:52:05.034008 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm2lk\" (UniqueName: \"kubernetes.io/projected/a7095f92-8336-4c69-9c71-c3b9aa45bb82-kube-api-access-vm2lk\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:05 crc kubenswrapper[4778]: I0318 09:52:05.422639 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" event={"ID":"a7095f92-8336-4c69-9c71-c3b9aa45bb82","Type":"ContainerDied","Data":"23f7c02e5a9c6e20be9eb80c9194cf2d4fc7da4d970b1593d3b268df4135066a"} Mar 18 09:52:05 crc kubenswrapper[4778]: I0318 09:52:05.423095 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23f7c02e5a9c6e20be9eb80c9194cf2d4fc7da4d970b1593d3b268df4135066a" Mar 18 09:52:05 crc kubenswrapper[4778]: I0318 09:52:05.423236 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:05 crc kubenswrapper[4778]: I0318 09:52:05.832800 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563786-wpjmv"] Mar 18 09:52:05 crc kubenswrapper[4778]: I0318 09:52:05.840765 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563786-wpjmv"] Mar 18 09:52:06 crc kubenswrapper[4778]: I0318 09:52:06.198356 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81f72c3-90fb-4526-97e3-977f3dbd00b0" path="/var/lib/kubelet/pods/e81f72c3-90fb-4526-97e3-977f3dbd00b0/volumes" Mar 18 09:52:06 crc kubenswrapper[4778]: I0318 09:52:06.289832 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:52:06 crc kubenswrapper[4778]: I0318 09:52:06.336551 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:52:06 crc kubenswrapper[4778]: I0318 09:52:06.524434 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xxv6"] Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.436431 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4xxv6" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="registry-server" containerID="cri-o://869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7" gracePeriod=2 Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.850089 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.885253 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgc6j\" (UniqueName: \"kubernetes.io/projected/c4b995fc-abe8-41af-9287-6381d6a3f37e-kube-api-access-xgc6j\") pod \"c4b995fc-abe8-41af-9287-6381d6a3f37e\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.885375 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-catalog-content\") pod \"c4b995fc-abe8-41af-9287-6381d6a3f37e\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.911223 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4b995fc-abe8-41af-9287-6381d6a3f37e-kube-api-access-xgc6j" (OuterVolumeSpecName: "kube-api-access-xgc6j") pod "c4b995fc-abe8-41af-9287-6381d6a3f37e" (UID: "c4b995fc-abe8-41af-9287-6381d6a3f37e"). InnerVolumeSpecName "kube-api-access-xgc6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.986952 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-utilities\") pod \"c4b995fc-abe8-41af-9287-6381d6a3f37e\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.987693 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgc6j\" (UniqueName: \"kubernetes.io/projected/c4b995fc-abe8-41af-9287-6381d6a3f37e-kube-api-access-xgc6j\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.987919 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-utilities" (OuterVolumeSpecName: "utilities") pod "c4b995fc-abe8-41af-9287-6381d6a3f37e" (UID: "c4b995fc-abe8-41af-9287-6381d6a3f37e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.018069 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4b995fc-abe8-41af-9287-6381d6a3f37e" (UID: "c4b995fc-abe8-41af-9287-6381d6a3f37e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.231139 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.231254 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.447499 4778 generic.go:334] "Generic (PLEG): container finished" podID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerID="869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7" exitCode=0 Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.447553 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerDied","Data":"869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7"} Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.447827 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerDied","Data":"89f244671c9af7d58f97b9b32aa5636eb413ce69e26ae52cac5094e2618def32"} Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.447844 4778 scope.go:117] "RemoveContainer" containerID="869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.447569 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.467472 4778 scope.go:117] "RemoveContainer" containerID="5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.470408 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xxv6"] Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.478149 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4xxv6"] Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.488714 4778 scope.go:117] "RemoveContainer" containerID="2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.528186 4778 scope.go:117] "RemoveContainer" containerID="869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7" Mar 18 09:52:08 crc kubenswrapper[4778]: E0318 09:52:08.528724 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7\": container with ID starting with 869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7 not found: ID does not exist" containerID="869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.528759 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7"} err="failed to get container status \"869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7\": rpc error: code = NotFound desc = could not find container \"869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7\": container with ID starting with 869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7 not found: ID does 
not exist" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.528787 4778 scope.go:117] "RemoveContainer" containerID="5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4" Mar 18 09:52:08 crc kubenswrapper[4778]: E0318 09:52:08.529254 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4\": container with ID starting with 5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4 not found: ID does not exist" containerID="5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.529283 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4"} err="failed to get container status \"5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4\": rpc error: code = NotFound desc = could not find container \"5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4\": container with ID starting with 5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4 not found: ID does not exist" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.529302 4778 scope.go:117] "RemoveContainer" containerID="2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454" Mar 18 09:52:08 crc kubenswrapper[4778]: E0318 09:52:08.529586 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454\": container with ID starting with 2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454 not found: ID does not exist" containerID="2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.529637 4778 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454"} err="failed to get container status \"2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454\": rpc error: code = NotFound desc = could not find container \"2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454\": container with ID starting with 2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454 not found: ID does not exist" Mar 18 09:52:10 crc kubenswrapper[4778]: I0318 09:52:10.207434 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" path="/var/lib/kubelet/pods/c4b995fc-abe8-41af-9287-6381d6a3f37e/volumes" Mar 18 09:52:30 crc kubenswrapper[4778]: I0318 09:52:30.148839 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:52:30 crc kubenswrapper[4778]: I0318 09:52:30.149535 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:52:32 crc kubenswrapper[4778]: I0318 09:52:32.877563 4778 scope.go:117] "RemoveContainer" containerID="f0336ddfd7a0dbb015d37a5f5151d0bd63e8c2d9a92eb6c0cfc48a0cb9420252" Mar 18 09:52:57 crc kubenswrapper[4778]: I0318 09:52:57.924059 4778 generic.go:334] "Generic (PLEG): container finished" podID="b2db5491-57b4-427a-b306-5e525a1e7c27" containerID="3b076b40e8f07ea23b1427f17330f5415549c41b6c6f9192b8ef848601e5be2b" exitCode=0 Mar 18 09:52:57 crc kubenswrapper[4778]: I0318 09:52:57.924162 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" event={"ID":"b2db5491-57b4-427a-b306-5e525a1e7c27","Type":"ContainerDied","Data":"3b076b40e8f07ea23b1427f17330f5415549c41b6c6f9192b8ef848601e5be2b"} Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.394694 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552268 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-inventory\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552384 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6zxg\" (UniqueName: \"kubernetes.io/projected/b2db5491-57b4-427a-b306-5e525a1e7c27-kube-api-access-z6zxg\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552422 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-0\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552442 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-3\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552465 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-1\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552491 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-2\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552533 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-1\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552554 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-0\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552599 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-custom-ceph-combined-ca-bundle\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552646 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-extra-config-0\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552671 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ssh-key-openstack-edpm-ipam\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552730 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph-nova-0\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.558486 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.558550 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph" (OuterVolumeSpecName: "ceph") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.559642 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2db5491-57b4-427a-b306-5e525a1e7c27-kube-api-access-z6zxg" (OuterVolumeSpecName: "kube-api-access-z6zxg") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "kube-api-access-z6zxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.578237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.585705 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "ceph-nova-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.586157 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.588569 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.590381 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.593392 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.594572 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.597440 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-inventory" (OuterVolumeSpecName: "inventory") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.598356 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.605357 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656048 4778 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656115 4778 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656137 4778 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656158 4778 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656177 4778 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656223 4778 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656246 4778 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656266 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656287 4778 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656306 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656329 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656347 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6zxg\" (UniqueName: \"kubernetes.io/projected/b2db5491-57b4-427a-b306-5e525a1e7c27-kube-api-access-z6zxg\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656368 4778 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.949333 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" 
event={"ID":"b2db5491-57b4-427a-b306-5e525a1e7c27","Type":"ContainerDied","Data":"ae914e239ff54eca2bb96c1bbf0bed7d47de287f780b43c281d7c1dcccb9c71c"} Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.949660 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae914e239ff54eca2bb96c1bbf0bed7d47de287f780b43c281d7c1dcccb9c71c" Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.949436 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.147592 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.147916 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.148045 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.148904 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:53:00 crc 
kubenswrapper[4778]: I0318 09:53:00.149057 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" gracePeriod=600 Mar 18 09:53:00 crc kubenswrapper[4778]: E0318 09:53:00.268005 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.958532 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" exitCode=0 Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.959418 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3"} Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.959527 4778 scope.go:117] "RemoveContainer" containerID="5df8914ca5e61db32f1727722fb361804c23982bfeee84574751dabf5cd01a2d" Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.960153 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:53:00 crc kubenswrapper[4778]: E0318 09:53:00.960451 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:53:12 crc kubenswrapper[4778]: I0318 09:53:12.187921 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:53:12 crc kubenswrapper[4778]: E0318 09:53:12.188932 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.337422 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 18 09:53:14 crc kubenswrapper[4778]: E0318 09:53:14.338250 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="extract-content" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338270 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="extract-content" Mar 18 09:53:14 crc kubenswrapper[4778]: E0318 09:53:14.338288 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="registry-server" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338297 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="registry-server" Mar 18 09:53:14 crc kubenswrapper[4778]: E0318 09:53:14.338305 4778 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="extract-utilities" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338313 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="extract-utilities" Mar 18 09:53:14 crc kubenswrapper[4778]: E0318 09:53:14.338333 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2db5491-57b4-427a-b306-5e525a1e7c27" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338342 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2db5491-57b4-427a-b306-5e525a1e7c27" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 18 09:53:14 crc kubenswrapper[4778]: E0318 09:53:14.338365 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7095f92-8336-4c69-9c71-c3b9aa45bb82" containerName="oc" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338373 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7095f92-8336-4c69-9c71-c3b9aa45bb82" containerName="oc" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338598 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="registry-server" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338626 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2db5491-57b4-427a-b306-5e525a1e7c27" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338643 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7095f92-8336-4c69-9c71-c3b9aa45bb82" containerName="oc" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.339769 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.341799 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.342025 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.346855 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.348267 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.350185 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.364552 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.425608 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.471894 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.471937 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 
18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.471956 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/81d18509-d2fc-47e2-b814-94c4807a4dd6-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.471980 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.471996 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-dev\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472026 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472068 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-ceph\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472083 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-dev\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472114 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472132 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472145 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-lib-modules\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472209 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472225 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472244 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472268 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472317 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472339 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472357 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472405 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-scripts\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472448 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-config-data\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-sys\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472510 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpgcf\" (UniqueName: \"kubernetes.io/projected/81d18509-d2fc-47e2-b814-94c4807a4dd6-kube-api-access-xpgcf\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472537 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-lib-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472565 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvphs\" (UniqueName: \"kubernetes.io/projected/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-kube-api-access-bvphs\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472605 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472627 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472681 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-run\") pod \"cinder-volume-volume1-0\" (UID: 
\"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472717 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-run\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472750 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472773 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472796 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-sys\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.574550 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-sys\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.574851 
4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpgcf\" (UniqueName: \"kubernetes.io/projected/81d18509-d2fc-47e2-b814-94c4807a4dd6-kube-api-access-xpgcf\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.574684 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-sys\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.574961 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575051 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvphs\" (UniqueName: \"kubernetes.io/projected/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-kube-api-access-bvphs\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575144 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575189 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575252 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575290 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-run\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575318 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-run\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575341 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575364 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " 
pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575389 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-run\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575401 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-sys\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575410 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-run\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575434 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-sys\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575454 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575500 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/81d18509-d2fc-47e2-b814-94c4807a4dd6-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575490 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575457 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575524 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575647 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-dev\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " 
pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575702 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575710 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575762 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575786 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-ceph\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-dev\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575821 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575876 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-dev\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576057 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-lib-modules\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576084 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576093 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-dev\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 
09:53:14.576155 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576180 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-lib-modules\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576185 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576230 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576288 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-nvme\") pod 
\"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576329 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576412 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576461 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576483 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576529 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-scripts\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: 
I0318 09:53:14.576580 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-config-data\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.577011 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.577060 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.577143 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.577460 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.577494 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.577523 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.582094 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-scripts\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.582550 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-ceph\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.583342 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.583398 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 
09:53:14.583909 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.589284 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.589424 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.590043 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.591769 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/81d18509-d2fc-47e2-b814-94c4807a4dd6-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.592352 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-config-data\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.595307 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpgcf\" (UniqueName: \"kubernetes.io/projected/81d18509-d2fc-47e2-b814-94c4807a4dd6-kube-api-access-xpgcf\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.595367 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvphs\" (UniqueName: \"kubernetes.io/projected/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-kube-api-access-bvphs\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.664547 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.724797 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.846010 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-j5mf6"] Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.848871 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.859770 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-j5mf6"] Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.946760 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-91af-account-create-update-cc4d5"] Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.948407 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.952739 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.964666 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-91af-account-create-update-cc4d5"] Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.984390 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wmjx\" (UniqueName: \"kubernetes.io/projected/57dd6190-5149-44a9-8a75-7e3d9077a43c-kube-api-access-9wmjx\") pod \"manila-db-create-j5mf6\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.984489 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dd6190-5149-44a9-8a75-7e3d9077a43c-operator-scripts\") pod \"manila-db-create-j5mf6\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.085816 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skprq\" (UniqueName: 
\"kubernetes.io/projected/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-kube-api-access-skprq\") pod \"manila-91af-account-create-update-cc4d5\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.085993 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmjx\" (UniqueName: \"kubernetes.io/projected/57dd6190-5149-44a9-8a75-7e3d9077a43c-kube-api-access-9wmjx\") pod \"manila-db-create-j5mf6\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.086073 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dd6190-5149-44a9-8a75-7e3d9077a43c-operator-scripts\") pod \"manila-db-create-j5mf6\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.086099 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-operator-scripts\") pod \"manila-91af-account-create-update-cc4d5\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.087760 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dd6190-5149-44a9-8a75-7e3d9077a43c-operator-scripts\") pod \"manila-db-create-j5mf6\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.107307 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wmjx\" (UniqueName: 
\"kubernetes.io/projected/57dd6190-5149-44a9-8a75-7e3d9077a43c-kube-api-access-9wmjx\") pod \"manila-db-create-j5mf6\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.130438 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.133134 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.135990 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.135990 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.136348 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ntc8r" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.136411 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.142276 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.182318 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.189884 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-operator-scripts\") pod \"manila-91af-account-create-update-cc4d5\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.190064 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skprq\" (UniqueName: \"kubernetes.io/projected/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-kube-api-access-skprq\") pod \"manila-91af-account-create-update-cc4d5\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.191017 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-operator-scripts\") pod \"manila-91af-account-create-update-cc4d5\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.196312 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.198247 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.200879 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.201150 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.204254 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.217388 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skprq\" (UniqueName: \"kubernetes.io/projected/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-kube-api-access-skprq\") pod \"manila-91af-account-create-update-cc4d5\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.271036 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293017 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-logs\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293149 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293182 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z5mf\" (UniqueName: \"kubernetes.io/projected/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-kube-api-access-6z5mf\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293256 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293291 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293354 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293462 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293523 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293603 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-ceph\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.351995 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.397591 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.397676 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.397718 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.397764 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.397799 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400351 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-gsj6f\" (UniqueName: \"kubernetes.io/projected/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-kube-api-access-gsj6f\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400710 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400778 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400832 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400878 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400977 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-ceph\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.401113 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-logs\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.401175 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-logs\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.401310 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.401349 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z5mf\" (UniqueName: \"kubernetes.io/projected/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-kube-api-access-6z5mf\") 
pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.401417 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.401451 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.404317 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.405311 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-logs\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.407408 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-combined-ca-bundle\") pod \"glance-default-external-api-0\" 
(UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.408089 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-ceph\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.408854 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.409064 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.409955 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.413272 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: 
I0318 09:53:15.422845 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z5mf\" (UniqueName: \"kubernetes.io/projected/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-kube-api-access-6z5mf\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: W0318 09:53:15.431010 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81d18509_d2fc_47e2_b814_94c4807a4dd6.slice/crio-0d11d6b09e6f4473921af3645ab9916dc5f7e65427d3182e0f95832cfffd36ea WatchSource:0}: Error finding container 0d11d6b09e6f4473921af3645ab9916dc5f7e65427d3182e0f95832cfffd36ea: Status 404 returned error can't find the container with id 0d11d6b09e6f4473921af3645ab9916dc5f7e65427d3182e0f95832cfffd36ea Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.432284 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.442688 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.457076 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.503798 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.503858 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.503927 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.503951 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsj6f\" (UniqueName: \"kubernetes.io/projected/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-kube-api-access-gsj6f\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.504000 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.504080 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.504135 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.504240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-logs\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.504284 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.505114 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.505727 4778 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.506566 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-logs\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.509884 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.511987 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.512072 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.513470 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.521410 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.524501 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsj6f\" (UniqueName: \"kubernetes.io/projected/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-kube-api-access-gsj6f\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.561869 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.645611 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-j5mf6"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.762808 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-91af-account-create-update-cc4d5"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.821478 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.126910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13","Type":"ContainerStarted","Data":"8385277fcdfe2cd15730c0a5c485b06db1e943ab2dc423f994281197c507a39e"} Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.149721 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.151526 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-j5mf6" event={"ID":"57dd6190-5149-44a9-8a75-7e3d9077a43c","Type":"ContainerStarted","Data":"fc0fef996b4b9a5437de59c8b2ac8a5e7d95ba6ac33a74f54e7f79985c001e66"} Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.151582 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-j5mf6" event={"ID":"57dd6190-5149-44a9-8a75-7e3d9077a43c","Type":"ContainerStarted","Data":"a42c12386bc9161895c66f5178800674e5acdcd3879ece3c4d3b64226793baed"} Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.160516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-91af-account-create-update-cc4d5" event={"ID":"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0","Type":"ContainerStarted","Data":"a03567ec27a1b447b64f31d017f135a530faf96727b9cdb25f25df3c2b11ab27"} Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.160575 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-91af-account-create-update-cc4d5" event={"ID":"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0","Type":"ContainerStarted","Data":"4e079b381ae16533a0c4f019135452d3abe9616c161f7f866e0e5b17f9a6ed6d"} Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.164695 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" 
event={"ID":"81d18509-d2fc-47e2-b814-94c4807a4dd6","Type":"ContainerStarted","Data":"0d11d6b09e6f4473921af3645ab9916dc5f7e65427d3182e0f95832cfffd36ea"} Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.184926 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-j5mf6" podStartSLOduration=2.184908718 podStartE2EDuration="2.184908718s" podCreationTimestamp="2026-03-18 09:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:16.179664965 +0000 UTC m=+3062.754409815" watchObservedRunningTime="2026-03-18 09:53:16.184908718 +0000 UTC m=+3062.759653558" Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.202172 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-91af-account-create-update-cc4d5" podStartSLOduration=2.202151686 podStartE2EDuration="2.202151686s" podCreationTimestamp="2026-03-18 09:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:16.193735647 +0000 UTC m=+3062.768480487" watchObservedRunningTime="2026-03-18 09:53:16.202151686 +0000 UTC m=+3062.776896526" Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.412131 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 09:53:16 crc kubenswrapper[4778]: W0318 09:53:16.420561 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda18a46b5_39a7_4da9_8994_5c4716bc0fc3.slice/crio-c470e3848c8d0d6eb31f8ce6345f18395471221c351cc8d486bcd5ef90ac1c27 WatchSource:0}: Error finding container c470e3848c8d0d6eb31f8ce6345f18395471221c351cc8d486bcd5ef90ac1c27: Status 404 returned error can't find the container with id 
c470e3848c8d0d6eb31f8ce6345f18395471221c351cc8d486bcd5ef90ac1c27 Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.189585 4778 generic.go:334] "Generic (PLEG): container finished" podID="c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" containerID="a03567ec27a1b447b64f31d017f135a530faf96727b9cdb25f25df3c2b11ab27" exitCode=0 Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.190168 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-91af-account-create-update-cc4d5" event={"ID":"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0","Type":"ContainerDied","Data":"a03567ec27a1b447b64f31d017f135a530faf96727b9cdb25f25df3c2b11ab27"} Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.203070 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"81d18509-d2fc-47e2-b814-94c4807a4dd6","Type":"ContainerStarted","Data":"f67a1fbff496f1be1b4a5ab25667b7894e5bc479f79274dc8a68a3e3b0d0f449"} Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.225389 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a18a46b5-39a7-4da9-8994-5c4716bc0fc3","Type":"ContainerStarted","Data":"1bdefb86686f4948b49677f54ecf66216845e261d7b9635b47c514601effcb59"} Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.225447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a18a46b5-39a7-4da9-8994-5c4716bc0fc3","Type":"ContainerStarted","Data":"c470e3848c8d0d6eb31f8ce6345f18395471221c351cc8d486bcd5ef90ac1c27"} Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.230890 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd","Type":"ContainerStarted","Data":"29bf8f14f65ee985d6e1a81ccba68541643fe934836bc5811a00b9cf6e9d3255"} Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.231004 4778 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd","Type":"ContainerStarted","Data":"83a869fb7c2b3e59aa1a1dfb377b28fef14e4860992724e6ab3c0e03c5ed2d14"} Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.235854 4778 generic.go:334] "Generic (PLEG): container finished" podID="57dd6190-5149-44a9-8a75-7e3d9077a43c" containerID="fc0fef996b4b9a5437de59c8b2ac8a5e7d95ba6ac33a74f54e7f79985c001e66" exitCode=0 Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.235907 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-j5mf6" event={"ID":"57dd6190-5149-44a9-8a75-7e3d9077a43c","Type":"ContainerDied","Data":"fc0fef996b4b9a5437de59c8b2ac8a5e7d95ba6ac33a74f54e7f79985c001e66"} Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.245109 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd","Type":"ContainerStarted","Data":"33986289d3f8e0f6641360b3e47ccb80fb9d1a8d78d05c15ad5cb1a1e9b60ccf"} Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.248152 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"81d18509-d2fc-47e2-b814-94c4807a4dd6","Type":"ContainerStarted","Data":"80de6fde3eb7f840dfb0144a8e27c20ffddf1efc593764891f2df40651f3dd80"} Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.250483 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13","Type":"ContainerStarted","Data":"810e9cda4787b4046333e5e6b466e35bd8a8305077a7123eae4208c665d2544b"} Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.250637 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" 
event={"ID":"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13","Type":"ContainerStarted","Data":"1f3e135394d7ac164c95b4e8c0d3adbf4f449320b8725260d7d3e5a8a98d53b5"} Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.254644 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a18a46b5-39a7-4da9-8994-5c4716bc0fc3","Type":"ContainerStarted","Data":"69b7e9d6fec514c78d542744cb69ed2d17add8756390d1b40ae8a6652b4fe172"} Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.281315 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.281296368 podStartE2EDuration="4.281296368s" podCreationTimestamp="2026-03-18 09:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:18.271564694 +0000 UTC m=+3064.846309554" watchObservedRunningTime="2026-03-18 09:53:18.281296368 +0000 UTC m=+3064.856041208" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.312140 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.832388597 podStartE2EDuration="4.312120085s" podCreationTimestamp="2026-03-18 09:53:14 +0000 UTC" firstStartedPulling="2026-03-18 09:53:15.434134132 +0000 UTC m=+3062.008878972" lastFinishedPulling="2026-03-18 09:53:16.91386562 +0000 UTC m=+3063.488610460" observedRunningTime="2026-03-18 09:53:18.308152488 +0000 UTC m=+3064.882897348" watchObservedRunningTime="2026-03-18 09:53:18.312120085 +0000 UTC m=+3064.886864925" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.340300 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.3402791 podStartE2EDuration="4.3402791s" podCreationTimestamp="2026-03-18 09:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:18.32848418 +0000 UTC m=+3064.903229040" watchObservedRunningTime="2026-03-18 09:53:18.3402791 +0000 UTC m=+3064.915023940" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.397463 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.821254175 podStartE2EDuration="4.397445654s" podCreationTimestamp="2026-03-18 09:53:14 +0000 UTC" firstStartedPulling="2026-03-18 09:53:15.337337262 +0000 UTC m=+3061.912082102" lastFinishedPulling="2026-03-18 09:53:16.913528741 +0000 UTC m=+3063.488273581" observedRunningTime="2026-03-18 09:53:18.351878415 +0000 UTC m=+3064.926623255" watchObservedRunningTime="2026-03-18 09:53:18.397445654 +0000 UTC m=+3064.972190494" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.582990 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.692650 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dd6190-5149-44a9-8a75-7e3d9077a43c-operator-scripts\") pod \"57dd6190-5149-44a9-8a75-7e3d9077a43c\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.693020 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wmjx\" (UniqueName: \"kubernetes.io/projected/57dd6190-5149-44a9-8a75-7e3d9077a43c-kube-api-access-9wmjx\") pod \"57dd6190-5149-44a9-8a75-7e3d9077a43c\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.693754 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57dd6190-5149-44a9-8a75-7e3d9077a43c-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "57dd6190-5149-44a9-8a75-7e3d9077a43c" (UID: "57dd6190-5149-44a9-8a75-7e3d9077a43c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.697834 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57dd6190-5149-44a9-8a75-7e3d9077a43c-kube-api-access-9wmjx" (OuterVolumeSpecName: "kube-api-access-9wmjx") pod "57dd6190-5149-44a9-8a75-7e3d9077a43c" (UID: "57dd6190-5149-44a9-8a75-7e3d9077a43c"). InnerVolumeSpecName "kube-api-access-9wmjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.758356 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.796771 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dd6190-5149-44a9-8a75-7e3d9077a43c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.796807 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wmjx\" (UniqueName: \"kubernetes.io/projected/57dd6190-5149-44a9-8a75-7e3d9077a43c-kube-api-access-9wmjx\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.897786 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skprq\" (UniqueName: \"kubernetes.io/projected/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-kube-api-access-skprq\") pod \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.897968 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-operator-scripts\") pod \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.898390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" (UID: "c9f651dd-ff4a-46c9-bd8c-0155be07f0a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.898693 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.902130 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-kube-api-access-skprq" (OuterVolumeSpecName: "kube-api-access-skprq") pod "c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" (UID: "c9f651dd-ff4a-46c9-bd8c-0155be07f0a0"). InnerVolumeSpecName "kube-api-access-skprq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.000734 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skprq\" (UniqueName: \"kubernetes.io/projected/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-kube-api-access-skprq\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.272063 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-j5mf6" event={"ID":"57dd6190-5149-44a9-8a75-7e3d9077a43c","Type":"ContainerDied","Data":"a42c12386bc9161895c66f5178800674e5acdcd3879ece3c4d3b64226793baed"} Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.272095 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.272102 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a42c12386bc9161895c66f5178800674e5acdcd3879ece3c4d3b64226793baed" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.273539 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-91af-account-create-update-cc4d5" event={"ID":"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0","Type":"ContainerDied","Data":"4e079b381ae16533a0c4f019135452d3abe9616c161f7f866e0e5b17f9a6ed6d"} Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.273568 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.273591 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e079b381ae16533a0c4f019135452d3abe9616c161f7f866e0e5b17f9a6ed6d" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.665012 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.725466 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.221098 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-fnlvs"] Mar 18 09:53:20 crc kubenswrapper[4778]: E0318 09:53:20.221750 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57dd6190-5149-44a9-8a75-7e3d9077a43c" containerName="mariadb-database-create" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.221773 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57dd6190-5149-44a9-8a75-7e3d9077a43c" containerName="mariadb-database-create" Mar 18 09:53:20 crc kubenswrapper[4778]: E0318 09:53:20.221823 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" containerName="mariadb-account-create-update" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.221833 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" containerName="mariadb-account-create-update" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.222066 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" containerName="mariadb-account-create-update" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.222091 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="57dd6190-5149-44a9-8a75-7e3d9077a43c" containerName="mariadb-database-create" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.222832 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.225909 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-fnlvs"] Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.226644 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-t68kd" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.226732 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.329932 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjgrm\" (UniqueName: \"kubernetes.io/projected/86f9ef2c-6a05-438a-a701-92c9ef84d46d-kube-api-access-qjgrm\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.330036 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-job-config-data\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.330070 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-combined-ca-bundle\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 
09:53:20.330102 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-config-data\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.431952 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-job-config-data\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.432025 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-combined-ca-bundle\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.432075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-config-data\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.432346 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjgrm\" (UniqueName: \"kubernetes.io/projected/86f9ef2c-6a05-438a-a701-92c9ef84d46d-kube-api-access-qjgrm\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.440987 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-combined-ca-bundle\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.441178 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-config-data\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.441260 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-job-config-data\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.452147 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjgrm\" (UniqueName: \"kubernetes.io/projected/86f9ef2c-6a05-438a-a701-92c9ef84d46d-kube-api-access-qjgrm\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.553742 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:21 crc kubenswrapper[4778]: I0318 09:53:21.104929 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-fnlvs"] Mar 18 09:53:21 crc kubenswrapper[4778]: W0318 09:53:21.114539 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86f9ef2c_6a05_438a_a701_92c9ef84d46d.slice/crio-384420f85ddf082ceab20b4626cf92ea5c7d09153e3ebeaaa51ba2c61916038c WatchSource:0}: Error finding container 384420f85ddf082ceab20b4626cf92ea5c7d09153e3ebeaaa51ba2c61916038c: Status 404 returned error can't find the container with id 384420f85ddf082ceab20b4626cf92ea5c7d09153e3ebeaaa51ba2c61916038c Mar 18 09:53:21 crc kubenswrapper[4778]: I0318 09:53:21.291682 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fnlvs" event={"ID":"86f9ef2c-6a05-438a-a701-92c9ef84d46d","Type":"ContainerStarted","Data":"384420f85ddf082ceab20b4626cf92ea5c7d09153e3ebeaaa51ba2c61916038c"} Mar 18 09:53:24 crc kubenswrapper[4778]: I0318 09:53:24.869171 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 18 09:53:24 crc kubenswrapper[4778]: I0318 09:53:24.949642 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.187052 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:53:25 crc kubenswrapper[4778]: E0318 09:53:25.187372 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.458165 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.458401 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.496147 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.516598 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.822678 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.822736 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.861679 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.865144 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:26 crc kubenswrapper[4778]: I0318 09:53:26.337336 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:26 crc kubenswrapper[4778]: I0318 09:53:26.337665 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 09:53:26 crc kubenswrapper[4778]: I0318 
09:53:26.337680 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:26 crc kubenswrapper[4778]: I0318 09:53:26.337691 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 09:53:27 crc kubenswrapper[4778]: I0318 09:53:27.346810 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fnlvs" event={"ID":"86f9ef2c-6a05-438a-a701-92c9ef84d46d","Type":"ContainerStarted","Data":"c09837191b6ba16c2c4c1ba6934469f4e373f1877eaa0a419ae86d94a526194d"} Mar 18 09:53:27 crc kubenswrapper[4778]: I0318 09:53:27.371542 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-fnlvs" podStartSLOduration=2.499737757 podStartE2EDuration="7.371512563s" podCreationTimestamp="2026-03-18 09:53:20 +0000 UTC" firstStartedPulling="2026-03-18 09:53:21.119072619 +0000 UTC m=+3067.693817459" lastFinishedPulling="2026-03-18 09:53:25.990847425 +0000 UTC m=+3072.565592265" observedRunningTime="2026-03-18 09:53:27.363171606 +0000 UTC m=+3073.937916466" watchObservedRunningTime="2026-03-18 09:53:27.371512563 +0000 UTC m=+3073.946257413" Mar 18 09:53:28 crc kubenswrapper[4778]: I0318 09:53:28.394458 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:28 crc kubenswrapper[4778]: I0318 09:53:28.394833 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 09:53:28 crc kubenswrapper[4778]: I0318 09:53:28.475655 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 09:53:28 crc kubenswrapper[4778]: I0318 09:53:28.475818 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 09:53:28 crc kubenswrapper[4778]: I0318 09:53:28.507765 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 09:53:28 crc kubenswrapper[4778]: I0318 09:53:28.527738 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:35 crc kubenswrapper[4778]: E0318 09:53:35.817402 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86f9ef2c_6a05_438a_a701_92c9ef84d46d.slice/crio-c09837191b6ba16c2c4c1ba6934469f4e373f1877eaa0a419ae86d94a526194d.scope\": RecentStats: unable to find data in memory cache]" Mar 18 09:53:36 crc kubenswrapper[4778]: I0318 09:53:36.440854 4778 generic.go:334] "Generic (PLEG): container finished" podID="86f9ef2c-6a05-438a-a701-92c9ef84d46d" containerID="c09837191b6ba16c2c4c1ba6934469f4e373f1877eaa0a419ae86d94a526194d" exitCode=0 Mar 18 09:53:36 crc kubenswrapper[4778]: I0318 09:53:36.440901 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fnlvs" event={"ID":"86f9ef2c-6a05-438a-a701-92c9ef84d46d","Type":"ContainerDied","Data":"c09837191b6ba16c2c4c1ba6934469f4e373f1877eaa0a419ae86d94a526194d"} Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.902114 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.969743 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjgrm\" (UniqueName: \"kubernetes.io/projected/86f9ef2c-6a05-438a-a701-92c9ef84d46d-kube-api-access-qjgrm\") pod \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.969900 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-combined-ca-bundle\") pod \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.969972 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-job-config-data\") pod \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.970032 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-config-data\") pod \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.978408 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f9ef2c-6a05-438a-a701-92c9ef84d46d-kube-api-access-qjgrm" (OuterVolumeSpecName: "kube-api-access-qjgrm") pod "86f9ef2c-6a05-438a-a701-92c9ef84d46d" (UID: "86f9ef2c-6a05-438a-a701-92c9ef84d46d"). InnerVolumeSpecName "kube-api-access-qjgrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.978518 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "86f9ef2c-6a05-438a-a701-92c9ef84d46d" (UID: "86f9ef2c-6a05-438a-a701-92c9ef84d46d"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.985577 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-config-data" (OuterVolumeSpecName: "config-data") pod "86f9ef2c-6a05-438a-a701-92c9ef84d46d" (UID: "86f9ef2c-6a05-438a-a701-92c9ef84d46d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.010121 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86f9ef2c-6a05-438a-a701-92c9ef84d46d" (UID: "86f9ef2c-6a05-438a-a701-92c9ef84d46d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.072917 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjgrm\" (UniqueName: \"kubernetes.io/projected/86f9ef2c-6a05-438a-a701-92c9ef84d46d-kube-api-access-qjgrm\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.072961 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.072973 4778 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.072986 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.461601 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fnlvs" event={"ID":"86f9ef2c-6a05-438a-a701-92c9ef84d46d","Type":"ContainerDied","Data":"384420f85ddf082ceab20b4626cf92ea5c7d09153e3ebeaaa51ba2c61916038c"} Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.461648 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="384420f85ddf082ceab20b4626cf92ea5c7d09153e3ebeaaa51ba2c61916038c" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.461657 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.749153 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:53:38 crc kubenswrapper[4778]: E0318 09:53:38.749866 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f9ef2c-6a05-438a-a701-92c9ef84d46d" containerName="manila-db-sync" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.749879 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f9ef2c-6a05-438a-a701-92c9ef84d46d" containerName="manila-db-sync" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.750050 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f9ef2c-6a05-438a-a701-92c9ef84d46d" containerName="manila-db-sync" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.751001 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.752291 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-t68kd" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.752780 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.753457 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.755312 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.759945 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.761771 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.763364 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.770237 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.778382 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.907117 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-pr7j8"] Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.908645 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909787 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909856 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm9bt\" (UniqueName: \"kubernetes.io/projected/e96ea8ef-858b-421c-9b80-c8e76e2bc368-kube-api-access-qm9bt\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909877 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-scripts\") pod \"manila-share-share1-0\" (UID: 
\"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909899 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909914 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96ea8ef-858b-421c-9b80-c8e76e2bc368-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909941 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909965 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909986 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data-custom\") pod \"manila-scheduler-0\" (UID: 
\"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.910018 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.910054 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.910073 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-ceph\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.910102 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frmhc\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-kube-api-access-frmhc\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.910118 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-scripts\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " 
pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.910136 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.924164 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-pr7j8"] Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011606 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-scripts\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011657 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011676 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96ea8ef-858b-421c-9b80-c8e76e2bc368-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011708 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: 
\"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011735 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011757 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011782 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85qbb\" (UniqueName: \"kubernetes.io/projected/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-kube-api-access-85qbb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011802 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011817 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" 
Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011937 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012028 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012029 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96ea8ef-858b-421c-9b80-c8e76e2bc368-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012068 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-config\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012152 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-ceph\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012215 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012215 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012780 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frmhc\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-kube-api-access-frmhc\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012811 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-scripts\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012832 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012853 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012903 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012943 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.013003 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm9bt\" (UniqueName: \"kubernetes.io/projected/e96ea8ef-858b-421c-9b80-c8e76e2bc368-kube-api-access-qm9bt\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.018442 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-scripts\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.018886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: 
\"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.019495 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.021361 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.021748 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.022501 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-ceph\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.029063 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.032290 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qm9bt\" (UniqueName: \"kubernetes.io/projected/e96ea8ef-858b-421c-9b80-c8e76e2bc368-kube-api-access-qm9bt\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.034423 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.035149 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-scripts\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.050492 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frmhc\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-kube-api-access-frmhc\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.070139 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.080245 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.114986 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.115062 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-config\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.115091 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.115136 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.115242 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 
09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.115369 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85qbb\" (UniqueName: \"kubernetes.io/projected/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-kube-api-access-85qbb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.116157 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.116333 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-config\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.116353 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.116559 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.116811 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.135367 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85qbb\" (UniqueName: \"kubernetes.io/projected/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-kube-api-access-85qbb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.164271 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.167509 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.183787 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.196674 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.228116 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319311 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc20709-2b88-4657-930d-2f893754bc1b-logs\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319368 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k982x\" (UniqueName: \"kubernetes.io/projected/9bc20709-2b88-4657-930d-2f893754bc1b-kube-api-access-k982x\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319392 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319517 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-scripts\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319775 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data-custom\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319806 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc20709-2b88-4657-930d-2f893754bc1b-etc-machine-id\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421330 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k982x\" (UniqueName: \"kubernetes.io/projected/9bc20709-2b88-4657-930d-2f893754bc1b-kube-api-access-k982x\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421382 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421405 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-scripts\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421436 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421508 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data-custom\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421529 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc20709-2b88-4657-930d-2f893754bc1b-etc-machine-id\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc20709-2b88-4657-930d-2f893754bc1b-logs\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421913 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc20709-2b88-4657-930d-2f893754bc1b-logs\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.424277 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc20709-2b88-4657-930d-2f893754bc1b-etc-machine-id\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.440983 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.442997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.443117 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data-custom\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.444869 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-scripts\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.446575 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k982x\" (UniqueName: \"kubernetes.io/projected/9bc20709-2b88-4657-930d-2f893754bc1b-kube-api-access-k982x\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.551776 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.658641 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.691779 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.842851 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-pr7j8"] Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.187433 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:53:40 crc kubenswrapper[4778]: E0318 09:53:40.188332 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.238000 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.483281 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e96ea8ef-858b-421c-9b80-c8e76e2bc368","Type":"ContainerStarted","Data":"0b6591d1b2d8334e5cf9a42b169e44b6dfae4bb6ceaf223f510fc2ac1f46c6e1"} Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.485060 4778 generic.go:334] "Generic (PLEG): container finished" podID="78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3" containerID="9f299a31d1798531e91bd8bb58c97e154d95fd9d6d916aab9cc49dafb47ef130" exitCode=0 Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.485121 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" event={"ID":"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3","Type":"ContainerDied","Data":"9f299a31d1798531e91bd8bb58c97e154d95fd9d6d916aab9cc49dafb47ef130"} Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.485147 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" event={"ID":"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3","Type":"ContainerStarted","Data":"ed13e4d0e96a9a590b68371f4ce0af3043bfdeaff584c0a9bdf33c91ee4ce087"} Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.487004 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9bc20709-2b88-4657-930d-2f893754bc1b","Type":"ContainerStarted","Data":"f9a46df295b9504ecd451a63940fff7234f2b6a94c7406f27a604fbc84c81698"} Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.489574 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3b330794-936e-4249-a59f-5c68279f210d","Type":"ContainerStarted","Data":"0707583ce3524176a76a27b3ba5678da894972f75c990dbcdccd8d5c152f9e65"} Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.522379 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" event={"ID":"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3","Type":"ContainerStarted","Data":"397c511d63225d67b276c060e57fbd9133ca0d18599ab3f217f12da84129d486"} Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.524115 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.539903 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9bc20709-2b88-4657-930d-2f893754bc1b","Type":"ContainerStarted","Data":"53b04d3a57dbbb5062db7ebeb0b4c01cafe0b4c61a318513e4752f51a74b73e5"} Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 
09:53:41.539948 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9bc20709-2b88-4657-930d-2f893754bc1b","Type":"ContainerStarted","Data":"113274e5db3ab3c7952100755601f5ca794e6a2311c6e4d33ee9cf0fe6058561"} Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.540009 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.544109 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e96ea8ef-858b-421c-9b80-c8e76e2bc368","Type":"ContainerStarted","Data":"f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f"} Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.624336 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" podStartSLOduration=3.622341691 podStartE2EDuration="3.622341691s" podCreationTimestamp="2026-03-18 09:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:41.572430294 +0000 UTC m=+3088.147175134" watchObservedRunningTime="2026-03-18 09:53:41.622341691 +0000 UTC m=+3088.197086531" Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.628668 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=2.628640441 podStartE2EDuration="2.628640441s" podCreationTimestamp="2026-03-18 09:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:41.614796976 +0000 UTC m=+3088.189541826" watchObservedRunningTime="2026-03-18 09:53:41.628640441 +0000 UTC m=+3088.203385281" Mar 18 09:53:42 crc kubenswrapper[4778]: I0318 09:53:42.074161 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 18 
09:53:42 crc kubenswrapper[4778]: I0318 09:53:42.559392 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e96ea8ef-858b-421c-9b80-c8e76e2bc368","Type":"ContainerStarted","Data":"45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa"} Mar 18 09:53:42 crc kubenswrapper[4778]: I0318 09:53:42.575917 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.55866214 podStartE2EDuration="4.575898075s" podCreationTimestamp="2026-03-18 09:53:38 +0000 UTC" firstStartedPulling="2026-03-18 09:53:39.70339872 +0000 UTC m=+3086.278143560" lastFinishedPulling="2026-03-18 09:53:40.720634655 +0000 UTC m=+3087.295379495" observedRunningTime="2026-03-18 09:53:42.574482986 +0000 UTC m=+3089.149227826" watchObservedRunningTime="2026-03-18 09:53:42.575898075 +0000 UTC m=+3089.150642915" Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.574705 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api-log" containerID="cri-o://113274e5db3ab3c7952100755601f5ca794e6a2311c6e4d33ee9cf0fe6058561" gracePeriod=30 Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.574755 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api" containerID="cri-o://53b04d3a57dbbb5062db7ebeb0b4c01cafe0b4c61a318513e4752f51a74b73e5" gracePeriod=30 Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.685274 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.685599 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6f19125-b59e-49f9-8819-0cad52114840" 
containerName="ceilometer-central-agent" containerID="cri-o://72617aca4aa8400e7af4a60ff63ae0a42982c759ce5821a76119f425b2752b6a" gracePeriod=30 Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.685729 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="sg-core" containerID="cri-o://28ea8f018593b4e5ea693b13884510308b8fe6c9f56dc77f633042aa7b3c2142" gracePeriod=30 Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.685790 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="proxy-httpd" containerID="cri-o://345ccf529d04d20baa9e9e06ea792464e3dc120ee1042052fe45190d2b838b33" gracePeriod=30 Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.685826 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="ceilometer-notification-agent" containerID="cri-o://d0ab691768395396ab9cd2b17a7818f8101a3f17ea12d03f3697e6b6b5c5a0b8" gracePeriod=30 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.585501 4778 generic.go:334] "Generic (PLEG): container finished" podID="9bc20709-2b88-4657-930d-2f893754bc1b" containerID="53b04d3a57dbbb5062db7ebeb0b4c01cafe0b4c61a318513e4752f51a74b73e5" exitCode=0 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.586010 4778 generic.go:334] "Generic (PLEG): container finished" podID="9bc20709-2b88-4657-930d-2f893754bc1b" containerID="113274e5db3ab3c7952100755601f5ca794e6a2311c6e4d33ee9cf0fe6058561" exitCode=143 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.585573 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9bc20709-2b88-4657-930d-2f893754bc1b","Type":"ContainerDied","Data":"53b04d3a57dbbb5062db7ebeb0b4c01cafe0b4c61a318513e4752f51a74b73e5"} Mar 
18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.586095 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9bc20709-2b88-4657-930d-2f893754bc1b","Type":"ContainerDied","Data":"113274e5db3ab3c7952100755601f5ca794e6a2311c6e4d33ee9cf0fe6058561"} Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592685 4778 generic.go:334] "Generic (PLEG): container finished" podID="d6f19125-b59e-49f9-8819-0cad52114840" containerID="345ccf529d04d20baa9e9e06ea792464e3dc120ee1042052fe45190d2b838b33" exitCode=0 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592716 4778 generic.go:334] "Generic (PLEG): container finished" podID="d6f19125-b59e-49f9-8819-0cad52114840" containerID="28ea8f018593b4e5ea693b13884510308b8fe6c9f56dc77f633042aa7b3c2142" exitCode=2 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592727 4778 generic.go:334] "Generic (PLEG): container finished" podID="d6f19125-b59e-49f9-8819-0cad52114840" containerID="d0ab691768395396ab9cd2b17a7818f8101a3f17ea12d03f3697e6b6b5c5a0b8" exitCode=0 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592738 4778 generic.go:334] "Generic (PLEG): container finished" podID="d6f19125-b59e-49f9-8819-0cad52114840" containerID="72617aca4aa8400e7af4a60ff63ae0a42982c759ce5821a76119f425b2752b6a" exitCode=0 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592761 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerDied","Data":"345ccf529d04d20baa9e9e06ea792464e3dc120ee1042052fe45190d2b838b33"} Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592802 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerDied","Data":"28ea8f018593b4e5ea693b13884510308b8fe6c9f56dc77f633042aa7b3c2142"} Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592815 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerDied","Data":"d0ab691768395396ab9cd2b17a7818f8101a3f17ea12d03f3697e6b6b5c5a0b8"} Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592825 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerDied","Data":"72617aca4aa8400e7af4a60ff63ae0a42982c759ce5821a76119f425b2752b6a"} Mar 18 09:53:46 crc kubenswrapper[4778]: I0318 09:53:46.980462 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.083547 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.101972 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k982x\" (UniqueName: \"kubernetes.io/projected/9bc20709-2b88-4657-930d-2f893754bc1b-kube-api-access-k982x\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.102626 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.102674 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data-custom\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.102721 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc20709-2b88-4657-930d-2f893754bc1b-logs\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.102919 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-scripts\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.104484 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bc20709-2b88-4657-930d-2f893754bc1b-logs" (OuterVolumeSpecName: "logs") pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.106018 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-combined-ca-bundle\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.106053 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc20709-2b88-4657-930d-2f893754bc1b-etc-machine-id\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.106828 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc20709-2b88-4657-930d-2f893754bc1b-kube-api-access-k982x" (OuterVolumeSpecName: "kube-api-access-k982x") 
pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "kube-api-access-k982x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.107572 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bc20709-2b88-4657-930d-2f893754bc1b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.107992 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc20709-2b88-4657-930d-2f893754bc1b-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.108766 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc20709-2b88-4657-930d-2f893754bc1b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.108865 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k982x\" (UniqueName: \"kubernetes.io/projected/9bc20709-2b88-4657-930d-2f893754bc1b-kube-api-access-k982x\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.109262 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-scripts" (OuterVolumeSpecName: "scripts") pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.118969 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.168709 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.178850 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data" (OuterVolumeSpecName: "config-data") pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209626 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-run-httpd\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209710 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-sg-core-conf-yaml\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209749 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7799h\" (UniqueName: \"kubernetes.io/projected/d6f19125-b59e-49f9-8819-0cad52114840-kube-api-access-7799h\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209779 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-log-httpd\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209819 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-scripts\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209874 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-ceilometer-tls-certs\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209899 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-combined-ca-bundle\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209952 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-config-data\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209984 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.210390 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.210409 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.210419 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.210428 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.210437 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.211959 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.216065 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-scripts" (OuterVolumeSpecName: "scripts") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.218075 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f19125-b59e-49f9-8819-0cad52114840-kube-api-access-7799h" (OuterVolumeSpecName: "kube-api-access-7799h") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "kube-api-access-7799h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.239908 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.276966 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.291359 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.312030 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7799h\" (UniqueName: \"kubernetes.io/projected/d6f19125-b59e-49f9-8819-0cad52114840-kube-api-access-7799h\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.312070 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.312079 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.312088 4778 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.312098 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.312108 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.324334 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-config-data" (OuterVolumeSpecName: "config-data") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.415104 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.627125 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.627152 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerDied","Data":"3887f968114761a1654028b0a71448b233a4c1f413820ca59edf51160a07bebd"} Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.627666 4778 scope.go:117] "RemoveContainer" containerID="345ccf529d04d20baa9e9e06ea792464e3dc120ee1042052fe45190d2b838b33" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.630335 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9bc20709-2b88-4657-930d-2f893754bc1b","Type":"ContainerDied","Data":"f9a46df295b9504ecd451a63940fff7234f2b6a94c7406f27a604fbc84c81698"} Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.630447 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.632706 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3b330794-936e-4249-a59f-5c68279f210d","Type":"ContainerStarted","Data":"563583bf7624c87ff5f7909a2664699f2f91b9c14e7dca73ca0082935cc34469"} Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.655247 4778 scope.go:117] "RemoveContainer" containerID="28ea8f018593b4e5ea693b13884510308b8fe6c9f56dc77f633042aa7b3c2142" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.701280 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.732992 4778 scope.go:117] "RemoveContainer" containerID="d0ab691768395396ab9cd2b17a7818f8101a3f17ea12d03f3697e6b6b5c5a0b8" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.748265 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.779278 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.785103 4778 scope.go:117] "RemoveContainer" containerID="72617aca4aa8400e7af4a60ff63ae0a42982c759ce5821a76119f425b2752b6a" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.793396 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802130 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: E0318 09:53:47.802662 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="proxy-httpd" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802689 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f19125-b59e-49f9-8819-0cad52114840" 
containerName="proxy-httpd" Mar 18 09:53:47 crc kubenswrapper[4778]: E0318 09:53:47.802720 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="ceilometer-notification-agent" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802731 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="ceilometer-notification-agent" Mar 18 09:53:47 crc kubenswrapper[4778]: E0318 09:53:47.802747 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="sg-core" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802755 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="sg-core" Mar 18 09:53:47 crc kubenswrapper[4778]: E0318 09:53:47.802768 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802776 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api" Mar 18 09:53:47 crc kubenswrapper[4778]: E0318 09:53:47.802807 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api-log" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802815 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api-log" Mar 18 09:53:47 crc kubenswrapper[4778]: E0318 09:53:47.802830 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="ceilometer-central-agent" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802838 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f19125-b59e-49f9-8819-0cad52114840" 
containerName="ceilometer-central-agent" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.803074 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="ceilometer-notification-agent" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.803094 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="ceilometer-central-agent" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.803111 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.803125 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api-log" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.803140 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="sg-core" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.803150 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="proxy-httpd" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.804357 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.807845 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.808396 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.809597 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.809676 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.814927 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.816344 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.818806 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.819081 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.819288 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.822571 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.841175 4778 scope.go:117] "RemoveContainer" containerID="53b04d3a57dbbb5062db7ebeb0b4c01cafe0b4c61a318513e4752f51a74b73e5" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.871083 4778 scope.go:117] "RemoveContainer" 
containerID="113274e5db3ab3c7952100755601f5ca794e6a2311c6e4d33ee9cf0fe6058561" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931090 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-config-data\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931173 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-config-data\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931297 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/35adb68e-2fb0-437c-bea7-e46f05e4918c-etc-machine-id\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931420 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-log-httpd\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931560 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931628 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-config-data-custom\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931800 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-public-tls-certs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931849 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932000 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26w4b\" (UniqueName: \"kubernetes.io/projected/35adb68e-2fb0-437c-bea7-e46f05e4918c-kube-api-access-26w4b\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-run-httpd\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932132 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/35adb68e-2fb0-437c-bea7-e46f05e4918c-logs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932232 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-scripts\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932372 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9scf\" (UniqueName: \"kubernetes.io/projected/65984d06-3c40-407a-b217-0df5cfedcd66-kube-api-access-h9scf\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932446 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-internal-tls-certs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932554 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932663 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932708 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-scripts\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034552 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26w4b\" (UniqueName: \"kubernetes.io/projected/35adb68e-2fb0-437c-bea7-e46f05e4918c-kube-api-access-26w4b\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034610 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-run-httpd\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034626 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35adb68e-2fb0-437c-bea7-e46f05e4918c-logs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034650 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-scripts\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034684 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h9scf\" (UniqueName: \"kubernetes.io/projected/65984d06-3c40-407a-b217-0df5cfedcd66-kube-api-access-h9scf\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034711 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-internal-tls-certs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034740 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034784 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034803 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-scripts\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034819 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-config-data\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " 
pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034833 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-config-data\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034851 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/35adb68e-2fb0-437c-bea7-e46f05e4918c-etc-machine-id\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034873 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-log-httpd\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034904 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034926 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-config-data-custom\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034965 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-public-tls-certs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034984 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.035840 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35adb68e-2fb0-437c-bea7-e46f05e4918c-logs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.035915 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/35adb68e-2fb0-437c-bea7-e46f05e4918c-etc-machine-id\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.037344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-log-httpd\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.037688 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-run-httpd\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.040402 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.040802 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.040997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.042552 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-internal-tls-certs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.044845 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-config-data\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.045492 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-scripts\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 
09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.047165 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-config-data\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.050498 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-public-tls-certs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.051261 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.052585 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-scripts\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.055401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26w4b\" (UniqueName: \"kubernetes.io/projected/35adb68e-2fb0-437c-bea7-e46f05e4918c-kube-api-access-26w4b\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.059809 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9scf\" (UniqueName: \"kubernetes.io/projected/65984d06-3c40-407a-b217-0df5cfedcd66-kube-api-access-h9scf\") 
pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.064052 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-config-data-custom\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.138371 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.157042 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.209043 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" path="/var/lib/kubelet/pods/9bc20709-2b88-4657-930d-2f893754bc1b/volumes" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.210431 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f19125-b59e-49f9-8819-0cad52114840" path="/var/lib/kubelet/pods/d6f19125-b59e-49f9-8819-0cad52114840/volumes" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.647515 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3b330794-936e-4249-a59f-5c68279f210d","Type":"ContainerStarted","Data":"f7b7a17c36a6b78e815b947bec74afda3bb87533ab77861252c363b1152b6474"} Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.733875 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.435959098 podStartE2EDuration="10.733854933s" podCreationTimestamp="2026-03-18 09:53:38 +0000 UTC" firstStartedPulling="2026-03-18 09:53:39.67132803 +0000 UTC m=+3086.246072870" 
lastFinishedPulling="2026-03-18 09:53:46.969223875 +0000 UTC m=+3093.543968705" observedRunningTime="2026-03-18 09:53:48.671982232 +0000 UTC m=+3095.246727082" watchObservedRunningTime="2026-03-18 09:53:48.733854933 +0000 UTC m=+3095.308599773" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.736619 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:48 crc kubenswrapper[4778]: W0318 09:53:48.738041 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35adb68e_2fb0_437c_bea7_e46f05e4918c.slice/crio-e0f26ed27b148995826401877fcd1d9442376ca16ac89a832cd20f9659b631e9 WatchSource:0}: Error finding container e0f26ed27b148995826401877fcd1d9442376ca16ac89a832cd20f9659b631e9: Status 404 returned error can't find the container with id e0f26ed27b148995826401877fcd1d9442376ca16ac89a832cd20f9659b631e9 Mar 18 09:53:48 crc kubenswrapper[4778]: W0318 09:53:48.761535 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65984d06_3c40_407a_b217_0df5cfedcd66.slice/crio-9a4f031251027580940fe7c126f24f661a0ee1e9d08f2164aa143d6e4499255a WatchSource:0}: Error finding container 9a4f031251027580940fe7c126f24f661a0ee1e9d08f2164aa143d6e4499255a: Status 404 returned error can't find the container with id 9a4f031251027580940fe7c126f24f661a0ee1e9d08f2164aa143d6e4499255a Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.766718 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.070293 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.081567 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 18 09:53:49 crc 
kubenswrapper[4778]: I0318 09:53:49.230799 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.361165 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-6qq8b"] Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.361719 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerName="dnsmasq-dns" containerID="cri-o://fd53655c1355ee9e239be226351d5c5537c6593f18ffcbefe53a7ddaf1ea2816" gracePeriod=10 Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.666670 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"35adb68e-2fb0-437c-bea7-e46f05e4918c","Type":"ContainerStarted","Data":"5da9ca9eaa10d047e8e888250aa3be62237c3adcd66ba4e1b6111a63774fe210"} Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.667012 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"35adb68e-2fb0-437c-bea7-e46f05e4918c","Type":"ContainerStarted","Data":"e0f26ed27b148995826401877fcd1d9442376ca16ac89a832cd20f9659b631e9"} Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.668813 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerStarted","Data":"82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102"} Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.668859 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerStarted","Data":"9a4f031251027580940fe7c126f24f661a0ee1e9d08f2164aa143d6e4499255a"} Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.671679 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerID="fd53655c1355ee9e239be226351d5c5537c6593f18ffcbefe53a7ddaf1ea2816" exitCode=0 Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.671752 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" event={"ID":"5778826c-0b71-4dad-af9c-c7ec7f04aa36","Type":"ContainerDied","Data":"fd53655c1355ee9e239be226351d5c5537c6593f18ffcbefe53a7ddaf1ea2816"} Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.869881 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.017330 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-dns-svc\") pod \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.017503 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-sb\") pod \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.017556 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2gn6\" (UniqueName: \"kubernetes.io/projected/5778826c-0b71-4dad-af9c-c7ec7f04aa36-kube-api-access-q2gn6\") pod \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.017586 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-config\") pod \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\" (UID: 
\"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.017629 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-nb\") pod \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.017647 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-openstack-edpm-ipam\") pod \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.042457 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5778826c-0b71-4dad-af9c-c7ec7f04aa36-kube-api-access-q2gn6" (OuterVolumeSpecName: "kube-api-access-q2gn6") pod "5778826c-0b71-4dad-af9c-c7ec7f04aa36" (UID: "5778826c-0b71-4dad-af9c-c7ec7f04aa36"). InnerVolumeSpecName "kube-api-access-q2gn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.099623 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5778826c-0b71-4dad-af9c-c7ec7f04aa36" (UID: "5778826c-0b71-4dad-af9c-c7ec7f04aa36"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.102187 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "5778826c-0b71-4dad-af9c-c7ec7f04aa36" (UID: "5778826c-0b71-4dad-af9c-c7ec7f04aa36"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.116993 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-config" (OuterVolumeSpecName: "config") pod "5778826c-0b71-4dad-af9c-c7ec7f04aa36" (UID: "5778826c-0b71-4dad-af9c-c7ec7f04aa36"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.120964 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2gn6\" (UniqueName: \"kubernetes.io/projected/5778826c-0b71-4dad-af9c-c7ec7f04aa36-kube-api-access-q2gn6\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.120997 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.121044 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.121058 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-openstack-edpm-ipam\") on node \"crc\" 
DevicePath \"\"" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.134329 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5778826c-0b71-4dad-af9c-c7ec7f04aa36" (UID: "5778826c-0b71-4dad-af9c-c7ec7f04aa36"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.176973 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5778826c-0b71-4dad-af9c-c7ec7f04aa36" (UID: "5778826c-0b71-4dad-af9c-c7ec7f04aa36"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.223490 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.223645 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.681857 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerStarted","Data":"959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75"} Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.684098 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" 
event={"ID":"5778826c-0b71-4dad-af9c-c7ec7f04aa36","Type":"ContainerDied","Data":"c164a0a62f55fc2358e88475eedc2afcbd5a8934bf23dc30fb39c81fc6c685f2"} Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.684135 4778 scope.go:117] "RemoveContainer" containerID="fd53655c1355ee9e239be226351d5c5537c6593f18ffcbefe53a7ddaf1ea2816" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.684261 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.688304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"35adb68e-2fb0-437c-bea7-e46f05e4918c","Type":"ContainerStarted","Data":"e61e45abcd65218969ff9e52928a7efe4dc2f816fcc6afa936b3d4a046157f25"} Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.688632 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.710265 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-6qq8b"] Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.722154 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-6qq8b"] Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.737553 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.737535334 podStartE2EDuration="3.737535334s" podCreationTimestamp="2026-03-18 09:53:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:50.724377657 +0000 UTC m=+3097.299122517" watchObservedRunningTime="2026-03-18 09:53:50.737535334 +0000 UTC m=+3097.312280174" Mar 18 09:53:51 crc kubenswrapper[4778]: I0318 09:53:51.175233 4778 scope.go:117] "RemoveContainer" 
containerID="959408f0646af1976d94cc538a0b42d120c9b802bebf63b681de404e7a6632a0" Mar 18 09:53:52 crc kubenswrapper[4778]: I0318 09:53:52.199644 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" path="/var/lib/kubelet/pods/5778826c-0b71-4dad-af9c-c7ec7f04aa36/volumes" Mar 18 09:53:52 crc kubenswrapper[4778]: I0318 09:53:52.720991 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerStarted","Data":"cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f"} Mar 18 09:53:53 crc kubenswrapper[4778]: I0318 09:53:53.351444 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.199520 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:53:54 crc kubenswrapper[4778]: E0318 09:53:54.200509 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.742826 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerStarted","Data":"7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106"} Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.743049 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" 
containerName="ceilometer-central-agent" containerID="cri-o://82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102" gracePeriod=30 Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.743123 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.743174 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="sg-core" containerID="cri-o://cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f" gracePeriod=30 Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.743223 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-notification-agent" containerID="cri-o://959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75" gracePeriod=30 Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.743163 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="proxy-httpd" containerID="cri-o://7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106" gracePeriod=30 Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.772743 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9765642420000002 podStartE2EDuration="7.772724384s" podCreationTimestamp="2026-03-18 09:53:47 +0000 UTC" firstStartedPulling="2026-03-18 09:53:48.764266279 +0000 UTC m=+3095.339011119" lastFinishedPulling="2026-03-18 09:53:53.560426421 +0000 UTC m=+3100.135171261" observedRunningTime="2026-03-18 09:53:54.770589757 +0000 UTC m=+3101.345334597" watchObservedRunningTime="2026-03-18 09:53:54.772724384 +0000 UTC m=+3101.347469224" Mar 18 09:53:55 crc 
kubenswrapper[4778]: I0318 09:53:55.766624 4778 generic.go:334] "Generic (PLEG): container finished" podID="65984d06-3c40-407a-b217-0df5cfedcd66" containerID="7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106" exitCode=0 Mar 18 09:53:55 crc kubenswrapper[4778]: I0318 09:53:55.766995 4778 generic.go:334] "Generic (PLEG): container finished" podID="65984d06-3c40-407a-b217-0df5cfedcd66" containerID="cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f" exitCode=2 Mar 18 09:53:55 crc kubenswrapper[4778]: I0318 09:53:55.767014 4778 generic.go:334] "Generic (PLEG): container finished" podID="65984d06-3c40-407a-b217-0df5cfedcd66" containerID="959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75" exitCode=0 Mar 18 09:53:55 crc kubenswrapper[4778]: I0318 09:53:55.766717 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerDied","Data":"7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106"} Mar 18 09:53:55 crc kubenswrapper[4778]: I0318 09:53:55.767071 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerDied","Data":"cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f"} Mar 18 09:53:55 crc kubenswrapper[4778]: I0318 09:53:55.767096 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerDied","Data":"959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75"} Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.120450 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.155563 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-ceilometer-tls-certs\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.155800 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-config-data\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.155913 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-scripts\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.156008 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-sg-core-conf-yaml\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.156097 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-run-httpd\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.156212 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-log-httpd\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.156442 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9scf\" (UniqueName: \"kubernetes.io/projected/65984d06-3c40-407a-b217-0df5cfedcd66-kube-api-access-h9scf\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.156557 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-combined-ca-bundle\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.157337 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.157352 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.183220 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65984d06-3c40-407a-b217-0df5cfedcd66-kube-api-access-h9scf" (OuterVolumeSpecName: "kube-api-access-h9scf") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "kube-api-access-h9scf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.197871 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-scripts" (OuterVolumeSpecName: "scripts") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.211584 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.228624 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.258864 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9scf\" (UniqueName: \"kubernetes.io/projected/65984d06-3c40-407a-b217-0df5cfedcd66-kube-api-access-h9scf\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.258896 4778 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.258905 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.258914 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.258923 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.258931 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.265171 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: 
"65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.281414 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-config-data" (OuterVolumeSpecName: "config-data") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.360427 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.360711 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.776860 4778 generic.go:334] "Generic (PLEG): container finished" podID="65984d06-3c40-407a-b217-0df5cfedcd66" containerID="82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102" exitCode=0 Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.776902 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerDied","Data":"82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102"} Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.776936 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerDied","Data":"9a4f031251027580940fe7c126f24f661a0ee1e9d08f2164aa143d6e4499255a"} Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 
09:53:56.776954 4778 scope.go:117] "RemoveContainer" containerID="7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.778046 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.805633 4778 scope.go:117] "RemoveContainer" containerID="cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.832257 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.857034 4778 scope.go:117] "RemoveContainer" containerID="959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.869360 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.880509 4778 scope.go:117] "RemoveContainer" containerID="82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883370 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.883766 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-central-agent" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883782 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-central-agent" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.883798 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="sg-core" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883805 4778 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="sg-core" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.883830 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerName="init" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883836 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerName="init" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.883843 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-notification-agent" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883849 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-notification-agent" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.883863 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerName="dnsmasq-dns" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883868 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerName="dnsmasq-dns" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.883882 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="proxy-httpd" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883888 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="proxy-httpd" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.884051 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="sg-core" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.884066 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-notification-agent" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.884075 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="proxy-httpd" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.884087 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-central-agent" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.884099 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerName="dnsmasq-dns" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.885785 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.888450 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.888986 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.896082 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.896677 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.919454 4778 scope.go:117] "RemoveContainer" containerID="7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.921311 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106\": container with ID starting with 
7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106 not found: ID does not exist" containerID="7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.921420 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106"} err="failed to get container status \"7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106\": rpc error: code = NotFound desc = could not find container \"7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106\": container with ID starting with 7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106 not found: ID does not exist" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.921499 4778 scope.go:117] "RemoveContainer" containerID="cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.922034 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f\": container with ID starting with cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f not found: ID does not exist" containerID="cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.922150 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f"} err="failed to get container status \"cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f\": rpc error: code = NotFound desc = could not find container \"cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f\": container with ID starting with cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f not found: ID does not 
exist" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.922380 4778 scope.go:117] "RemoveContainer" containerID="959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.922694 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75\": container with ID starting with 959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75 not found: ID does not exist" containerID="959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.922778 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75"} err="failed to get container status \"959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75\": rpc error: code = NotFound desc = could not find container \"959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75\": container with ID starting with 959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75 not found: ID does not exist" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.922848 4778 scope.go:117] "RemoveContainer" containerID="82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.923118 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102\": container with ID starting with 82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102 not found: ID does not exist" containerID="82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.923229 4778 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102"} err="failed to get container status \"82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102\": rpc error: code = NotFound desc = could not find container \"82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102\": container with ID starting with 82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102 not found: ID does not exist" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969147 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52bc493f-72e9-4387-9b91-13343fc7d550-log-httpd\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969176 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969309 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52bc493f-72e9-4387-9b91-13343fc7d550-run-httpd\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 
09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969342 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-scripts\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969397 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969430 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-config-data\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969461 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwdff\" (UniqueName: \"kubernetes.io/projected/52bc493f-72e9-4387-9b91-13343fc7d550-kube-api-access-lwdff\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071360 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071422 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-config-data\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071458 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwdff\" (UniqueName: \"kubernetes.io/projected/52bc493f-72e9-4387-9b91-13343fc7d550-kube-api-access-lwdff\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071512 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071551 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52bc493f-72e9-4387-9b91-13343fc7d550-log-httpd\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071578 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071661 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52bc493f-72e9-4387-9b91-13343fc7d550-run-httpd\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" 
Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071693 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-scripts\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.072399 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52bc493f-72e9-4387-9b91-13343fc7d550-log-httpd\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.072426 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52bc493f-72e9-4387-9b91-13343fc7d550-run-httpd\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.077419 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.079692 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.080185 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-scripts\") pod \"ceilometer-0\" (UID: 
\"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.080328 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.088467 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-config-data\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.092020 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwdff\" (UniqueName: \"kubernetes.io/projected/52bc493f-72e9-4387-9b91-13343fc7d550-kube-api-access-lwdff\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.206642 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.700643 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:57 crc kubenswrapper[4778]: W0318 09:53:57.717277 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52bc493f_72e9_4387_9b91_13343fc7d550.slice/crio-4377686f9d70c59c5780a5e8b27cbaa293811031de0dc9240ed95d1fd84a4809 WatchSource:0}: Error finding container 4377686f9d70c59c5780a5e8b27cbaa293811031de0dc9240ed95d1fd84a4809: Status 404 returned error can't find the container with id 4377686f9d70c59c5780a5e8b27cbaa293811031de0dc9240ed95d1fd84a4809 Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.720118 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.787704 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52bc493f-72e9-4387-9b91-13343fc7d550","Type":"ContainerStarted","Data":"4377686f9d70c59c5780a5e8b27cbaa293811031de0dc9240ed95d1fd84a4809"} Mar 18 09:53:58 crc kubenswrapper[4778]: I0318 09:53:58.199920 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" path="/var/lib/kubelet/pods/65984d06-3c40-407a-b217-0df5cfedcd66/volumes" Mar 18 09:53:58 crc kubenswrapper[4778]: I0318 09:53:58.825503 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52bc493f-72e9-4387-9b91-13343fc7d550","Type":"ContainerStarted","Data":"9231b7e8988076a1286ece2bdfd759ab5abea7b27b2046e1f79083b6d9028063"} Mar 18 09:53:59 crc kubenswrapper[4778]: I0318 09:53:59.845423 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"52bc493f-72e9-4387-9b91-13343fc7d550","Type":"ContainerStarted","Data":"4b72b1f86dd6b64423280129d67be8592ca5209a7d043c97a2057d96e1a00d59"} Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.149396 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563794-586xm"] Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.154828 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.158534 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.160461 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.162624 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563794-586xm"] Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.164681 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.246316 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmdw\" (UniqueName: \"kubernetes.io/projected/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3-kube-api-access-cjmdw\") pod \"auto-csr-approver-29563794-586xm\" (UID: \"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3\") " pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.347918 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmdw\" (UniqueName: \"kubernetes.io/projected/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3-kube-api-access-cjmdw\") pod \"auto-csr-approver-29563794-586xm\" (UID: 
\"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3\") " pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.368207 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmdw\" (UniqueName: \"kubernetes.io/projected/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3-kube-api-access-cjmdw\") pod \"auto-csr-approver-29563794-586xm\" (UID: \"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3\") " pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.473698 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.681180 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.696381 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.750297 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.792001 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.858366 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="manila-share" containerID="cri-o://563583bf7624c87ff5f7909a2664699f2f91b9c14e7dca73ca0082935cc34469" gracePeriod=30 Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.858935 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="probe" 
containerID="cri-o://f7b7a17c36a6b78e815b947bec74afda3bb87533ab77861252c363b1152b6474" gracePeriod=30 Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.859032 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52bc493f-72e9-4387-9b91-13343fc7d550","Type":"ContainerStarted","Data":"901bb6ee88851abc816df355546ae1f36ba920b0ededf032f2250d52cb67dc93"} Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.859282 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="manila-scheduler" containerID="cri-o://f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f" gracePeriod=30 Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.859434 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="probe" containerID="cri-o://45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa" gracePeriod=30 Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.961850 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563794-586xm"] Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.881173 4778 generic.go:334] "Generic (PLEG): container finished" podID="3b330794-936e-4249-a59f-5c68279f210d" containerID="f7b7a17c36a6b78e815b947bec74afda3bb87533ab77861252c363b1152b6474" exitCode=0 Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.882033 4778 generic.go:334] "Generic (PLEG): container finished" podID="3b330794-936e-4249-a59f-5c68279f210d" containerID="563583bf7624c87ff5f7909a2664699f2f91b9c14e7dca73ca0082935cc34469" exitCode=1 Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.881233 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"3b330794-936e-4249-a59f-5c68279f210d","Type":"ContainerDied","Data":"f7b7a17c36a6b78e815b947bec74afda3bb87533ab77861252c363b1152b6474"} Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.882094 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3b330794-936e-4249-a59f-5c68279f210d","Type":"ContainerDied","Data":"563583bf7624c87ff5f7909a2664699f2f91b9c14e7dca73ca0082935cc34469"} Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.883647 4778 generic.go:334] "Generic (PLEG): container finished" podID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerID="45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa" exitCode=0 Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.883689 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e96ea8ef-858b-421c-9b80-c8e76e2bc368","Type":"ContainerDied","Data":"45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa"} Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.884353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563794-586xm" event={"ID":"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3","Type":"ContainerStarted","Data":"2ff6ff811698c9908d7bd73b0e4fa290162f5b8af6b3b1b1a64cc943bb4b1185"} Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.989366 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.084894 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-combined-ca-bundle\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.084975 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-scripts\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.085086 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.085788 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-var-lib-manila\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.085931 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-etc-machine-id\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.085956 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frmhc\" (UniqueName: 
\"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-kube-api-access-frmhc\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.086042 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-ceph\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.086060 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data-custom\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.086337 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.086601 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.086807 4778 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-var-lib-manila\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.086825 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.093614 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-scripts" (OuterVolumeSpecName: "scripts") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.093679 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-ceph" (OuterVolumeSpecName: "ceph") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.096178 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-kube-api-access-frmhc" (OuterVolumeSpecName: "kube-api-access-frmhc") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "kube-api-access-frmhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.107097 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.144912 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.188599 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frmhc\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-kube-api-access-frmhc\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.188633 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.188646 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.188654 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.188665 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.197446 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data" (OuterVolumeSpecName: "config-data") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.291013 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.895710 4778 generic.go:334] "Generic (PLEG): container finished" podID="c681ab9e-5bfe-4e10-9154-41ff0c5d76a3" containerID="1887b38177a6dc3b69f09e9dc6a6dd26a61cf63c5f532cbb5b0e04e1fb5a3d8b" exitCode=0 Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.895788 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563794-586xm" event={"ID":"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3","Type":"ContainerDied","Data":"1887b38177a6dc3b69f09e9dc6a6dd26a61cf63c5f532cbb5b0e04e1fb5a3d8b"} Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.900666 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3b330794-936e-4249-a59f-5c68279f210d","Type":"ContainerDied","Data":"0707583ce3524176a76a27b3ba5678da894972f75c990dbcdccd8d5c152f9e65"} Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.900721 4778 scope.go:117] "RemoveContainer" 
containerID="f7b7a17c36a6b78e815b947bec74afda3bb87533ab77861252c363b1152b6474" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.900882 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.904437 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52bc493f-72e9-4387-9b91-13343fc7d550","Type":"ContainerStarted","Data":"23ac028505c473e713082990956c9259c6dd33bd7cddc2fa1c7586ba732d4f84"} Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.904625 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.930944 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.853640727 podStartE2EDuration="6.93092168s" podCreationTimestamp="2026-03-18 09:53:56 +0000 UTC" firstStartedPulling="2026-03-18 09:53:57.719939519 +0000 UTC m=+3104.294684349" lastFinishedPulling="2026-03-18 09:54:01.797220462 +0000 UTC m=+3108.371965302" observedRunningTime="2026-03-18 09:54:02.930800887 +0000 UTC m=+3109.505545747" watchObservedRunningTime="2026-03-18 09:54:02.93092168 +0000 UTC m=+3109.505666530" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.951765 4778 scope.go:117] "RemoveContainer" containerID="563583bf7624c87ff5f7909a2664699f2f91b9c14e7dca73ca0082935cc34469" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.960167 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.972654 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.998330 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:54:02 crc 
kubenswrapper[4778]: E0318 09:54:02.998834 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="probe" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.998854 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="probe" Mar 18 09:54:02 crc kubenswrapper[4778]: E0318 09:54:02.998892 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="manila-share" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.998913 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="manila-share" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.999117 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="manila-share" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.999140 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="probe" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.001936 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.010888 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.013523 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.129716 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.130181 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-config-data\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.130276 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-scripts\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.130392 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: 
I0318 09:54:03.130429 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/821dda0e-cde2-45a4-b23a-3d13565be515-ceph\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.130531 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/821dda0e-cde2-45a4-b23a-3d13565be515-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.130614 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/821dda0e-cde2-45a4-b23a-3d13565be515-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.130650 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fddh\" (UniqueName: \"kubernetes.io/projected/821dda0e-cde2-45a4-b23a-3d13565be515-kube-api-access-8fddh\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232516 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232577 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/821dda0e-cde2-45a4-b23a-3d13565be515-ceph\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232619 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/821dda0e-cde2-45a4-b23a-3d13565be515-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232669 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/821dda0e-cde2-45a4-b23a-3d13565be515-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232699 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fddh\" (UniqueName: \"kubernetes.io/projected/821dda0e-cde2-45a4-b23a-3d13565be515-kube-api-access-8fddh\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232780 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232837 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/821dda0e-cde2-45a4-b23a-3d13565be515-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-config-data\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232888 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/821dda0e-cde2-45a4-b23a-3d13565be515-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232928 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-scripts\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.238507 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.239321 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/821dda0e-cde2-45a4-b23a-3d13565be515-ceph\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " 
pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.250880 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-scripts\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.251000 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.251612 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-config-data\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.253690 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fddh\" (UniqueName: \"kubernetes.io/projected/821dda0e-cde2-45a4-b23a-3d13565be515-kube-api-access-8fddh\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.347615 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.903855 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.928341 4778 generic.go:334] "Generic (PLEG): container finished" podID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerID="f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f" exitCode=0 Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.928787 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e96ea8ef-858b-421c-9b80-c8e76e2bc368","Type":"ContainerDied","Data":"f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f"} Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.928806 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.928824 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e96ea8ef-858b-421c-9b80-c8e76e2bc368","Type":"ContainerDied","Data":"0b6591d1b2d8334e5cf9a42b169e44b6dfae4bb6ceaf223f510fc2ac1f46c6e1"} Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.928846 4778 scope.go:117] "RemoveContainer" containerID="45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.953702 4778 scope.go:117] "RemoveContainer" containerID="f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.974049 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.007119 4778 scope.go:117] "RemoveContainer" containerID="45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa" Mar 18 09:54:04 crc kubenswrapper[4778]: E0318 09:54:04.008378 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa\": container with ID starting with 45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa not found: ID does not exist" containerID="45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.008427 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa"} err="failed to get container status \"45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa\": rpc error: code = NotFound desc = could not find container \"45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa\": container with ID starting with 45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa not found: ID does not exist" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.008458 4778 scope.go:117] "RemoveContainer" containerID="f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f" Mar 18 09:54:04 crc kubenswrapper[4778]: E0318 09:54:04.009316 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f\": container with ID starting with f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f not found: ID does not exist" containerID="f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.009351 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f"} err="failed to get container status \"f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f\": rpc error: code = NotFound desc = could not find container \"f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f\": container with ID 
starting with f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f not found: ID does not exist" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.045572 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data\") pod \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.045710 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-combined-ca-bundle\") pod \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.045782 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data-custom\") pod \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.045986 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-scripts\") pod \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.046062 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96ea8ef-858b-421c-9b80-c8e76e2bc368-etc-machine-id\") pod \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.046132 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-qm9bt\" (UniqueName: \"kubernetes.io/projected/e96ea8ef-858b-421c-9b80-c8e76e2bc368-kube-api-access-qm9bt\") pod \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.046463 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e96ea8ef-858b-421c-9b80-c8e76e2bc368-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e96ea8ef-858b-421c-9b80-c8e76e2bc368" (UID: "e96ea8ef-858b-421c-9b80-c8e76e2bc368"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.047470 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96ea8ef-858b-421c-9b80-c8e76e2bc368-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.055721 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-scripts" (OuterVolumeSpecName: "scripts") pod "e96ea8ef-858b-421c-9b80-c8e76e2bc368" (UID: "e96ea8ef-858b-421c-9b80-c8e76e2bc368"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.055896 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e96ea8ef-858b-421c-9b80-c8e76e2bc368-kube-api-access-qm9bt" (OuterVolumeSpecName: "kube-api-access-qm9bt") pod "e96ea8ef-858b-421c-9b80-c8e76e2bc368" (UID: "e96ea8ef-858b-421c-9b80-c8e76e2bc368"). InnerVolumeSpecName "kube-api-access-qm9bt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.057438 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e96ea8ef-858b-421c-9b80-c8e76e2bc368" (UID: "e96ea8ef-858b-421c-9b80-c8e76e2bc368"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.109169 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e96ea8ef-858b-421c-9b80-c8e76e2bc368" (UID: "e96ea8ef-858b-421c-9b80-c8e76e2bc368"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.149851 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.149890 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.149900 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.149909 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm9bt\" (UniqueName: \"kubernetes.io/projected/e96ea8ef-858b-421c-9b80-c8e76e2bc368-kube-api-access-qm9bt\") 
on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.163856 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data" (OuterVolumeSpecName: "config-data") pod "e96ea8ef-858b-421c-9b80-c8e76e2bc368" (UID: "e96ea8ef-858b-421c-9b80-c8e76e2bc368"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.179705 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.203715 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b330794-936e-4249-a59f-5c68279f210d" path="/var/lib/kubelet/pods/3b330794-936e-4249-a59f-5c68279f210d/volumes" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.255969 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjmdw\" (UniqueName: \"kubernetes.io/projected/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3-kube-api-access-cjmdw\") pod \"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3\" (UID: \"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.256479 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.267043 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3-kube-api-access-cjmdw" (OuterVolumeSpecName: "kube-api-access-cjmdw") pod "c681ab9e-5bfe-4e10-9154-41ff0c5d76a3" (UID: "c681ab9e-5bfe-4e10-9154-41ff0c5d76a3"). InnerVolumeSpecName "kube-api-access-cjmdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.289835 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.306254 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.323483 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:54:04 crc kubenswrapper[4778]: E0318 09:54:04.323972 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c681ab9e-5bfe-4e10-9154-41ff0c5d76a3" containerName="oc" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.323990 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c681ab9e-5bfe-4e10-9154-41ff0c5d76a3" containerName="oc" Mar 18 09:54:04 crc kubenswrapper[4778]: E0318 09:54:04.324018 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="manila-scheduler" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.324025 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="manila-scheduler" Mar 18 09:54:04 crc kubenswrapper[4778]: E0318 09:54:04.324044 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="probe" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.324050 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="probe" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.324236 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="probe" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.324249 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c681ab9e-5bfe-4e10-9154-41ff0c5d76a3" containerName="oc" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.324268 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="manila-scheduler" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.325284 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.327029 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.333738 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.358366 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjmdw\" (UniqueName: \"kubernetes.io/projected/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3-kube-api-access-cjmdw\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.459773 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rfzk\" (UniqueName: \"kubernetes.io/projected/e9af702d-3a1a-490e-82f5-e99c1718ef83-kube-api-access-9rfzk\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.459814 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-config-data\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.459852 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.460025 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.460075 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-scripts\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.460171 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9af702d-3a1a-490e-82f5-e99c1718ef83-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.561739 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.561807 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-scripts\") pod 
\"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.561880 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9af702d-3a1a-490e-82f5-e99c1718ef83-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.562174 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9af702d-3a1a-490e-82f5-e99c1718ef83-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.563395 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rfzk\" (UniqueName: \"kubernetes.io/projected/e9af702d-3a1a-490e-82f5-e99c1718ef83-kube-api-access-9rfzk\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.563453 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-config-data\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.563527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: 
I0318 09:54:04.566937 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.567659 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-scripts\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.567997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-config-data\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.570020 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.578461 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rfzk\" (UniqueName: \"kubernetes.io/projected/e9af702d-3a1a-490e-82f5-e99c1718ef83-kube-api-access-9rfzk\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.648653 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.945754 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"821dda0e-cde2-45a4-b23a-3d13565be515","Type":"ContainerStarted","Data":"f8b2c3f2e882c8abefe1e9ec213d9e0ef5d62840b580d9e9b2da69d98ca7862e"} Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.946135 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"821dda0e-cde2-45a4-b23a-3d13565be515","Type":"ContainerStarted","Data":"13a4a6443261acc0962867a8d25652877daad9499fc8b57c1197b030736506ad"} Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.960847 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563794-586xm" event={"ID":"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3","Type":"ContainerDied","Data":"2ff6ff811698c9908d7bd73b0e4fa290162f5b8af6b3b1b1a64cc943bb4b1185"} Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.960896 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ff6ff811698c9908d7bd73b0e4fa290162f5b8af6b3b1b1a64cc943bb4b1185" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.960899 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:05 crc kubenswrapper[4778]: I0318 09:54:05.085954 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:54:05 crc kubenswrapper[4778]: I0318 09:54:05.262664 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563788-pctk8"] Mar 18 09:54:05 crc kubenswrapper[4778]: I0318 09:54:05.277140 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563788-pctk8"] Mar 18 09:54:05 crc kubenswrapper[4778]: I0318 09:54:05.974370 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"821dda0e-cde2-45a4-b23a-3d13565be515","Type":"ContainerStarted","Data":"005537f2c20835e85e19ce1f6f5d8263c3e3bedced6cc69c20f843c7e57240f4"} Mar 18 09:54:05 crc kubenswrapper[4778]: I0318 09:54:05.977309 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e9af702d-3a1a-490e-82f5-e99c1718ef83","Type":"ContainerStarted","Data":"724343d6dafc84c84a20658930d32f9fff855b3c1742e0f3e69c272f180b5e91"} Mar 18 09:54:05 crc kubenswrapper[4778]: I0318 09:54:05.977365 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e9af702d-3a1a-490e-82f5-e99c1718ef83","Type":"ContainerStarted","Data":"4fb077966a80a7907254d618f7b92ea4687f9b732c468fd32aab1668fb8f0977"} Mar 18 09:54:06 crc kubenswrapper[4778]: I0318 09:54:06.006930 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.006909873 podStartE2EDuration="4.006909873s" podCreationTimestamp="2026-03-18 09:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:54:05.995627726 +0000 UTC m=+3112.570372576" 
watchObservedRunningTime="2026-03-18 09:54:06.006909873 +0000 UTC m=+3112.581654713" Mar 18 09:54:06 crc kubenswrapper[4778]: I0318 09:54:06.202507 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc64d6e3-ed19-4365-ab83-8c1af026054b" path="/var/lib/kubelet/pods/dc64d6e3-ed19-4365-ab83-8c1af026054b/volumes" Mar 18 09:54:06 crc kubenswrapper[4778]: I0318 09:54:06.203497 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" path="/var/lib/kubelet/pods/e96ea8ef-858b-421c-9b80-c8e76e2bc368/volumes" Mar 18 09:54:06 crc kubenswrapper[4778]: I0318 09:54:06.988273 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e9af702d-3a1a-490e-82f5-e99c1718ef83","Type":"ContainerStarted","Data":"1a768cf698887e93db294d8cec66875bde6dcd3ae27a64b772641d4061d0d170"} Mar 18 09:54:07 crc kubenswrapper[4778]: I0318 09:54:07.023510 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.023493439 podStartE2EDuration="3.023493439s" podCreationTimestamp="2026-03-18 09:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:54:07.012134491 +0000 UTC m=+3113.586879341" watchObservedRunningTime="2026-03-18 09:54:07.023493439 +0000 UTC m=+3113.598238279" Mar 18 09:54:07 crc kubenswrapper[4778]: I0318 09:54:07.187407 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:54:07 crc kubenswrapper[4778]: E0318 09:54:07.187651 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:54:09 crc kubenswrapper[4778]: I0318 09:54:09.453090 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Mar 18 09:54:13 crc kubenswrapper[4778]: I0318 09:54:13.348150 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 18 09:54:14 crc kubenswrapper[4778]: I0318 09:54:14.649314 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 18 09:54:15 crc kubenswrapper[4778]: I0318 09:54:15.910524 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.202:3000/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 09:54:18 crc kubenswrapper[4778]: I0318 09:54:18.187234 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:54:18 crc kubenswrapper[4778]: E0318 09:54:18.187803 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:54:25 crc kubenswrapper[4778]: I0318 09:54:25.042273 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 18 09:54:26 crc 
kubenswrapper[4778]: I0318 09:54:26.131934 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 18 09:54:27 crc kubenswrapper[4778]: I0318 09:54:27.221031 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 09:54:32 crc kubenswrapper[4778]: I0318 09:54:32.187628 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:54:32 crc kubenswrapper[4778]: E0318 09:54:32.188511 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:54:33 crc kubenswrapper[4778]: I0318 09:54:33.006659 4778 scope.go:117] "RemoveContainer" containerID="12a11cfc29f7b65306c1684f9c90110c5f5f19bee2195c78cf1dbf6c7f4120dd" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.252278 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wkp9p"] Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.254416 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.276613 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wkp9p"] Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.315956 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-catalog-content\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.316029 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-utilities\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.316126 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbfl6\" (UniqueName: \"kubernetes.io/projected/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-kube-api-access-bbfl6\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.418400 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-catalog-content\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.418480 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-utilities\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.418586 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbfl6\" (UniqueName: \"kubernetes.io/projected/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-kube-api-access-bbfl6\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.419235 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-catalog-content\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.419451 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-utilities\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.442068 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbfl6\" (UniqueName: \"kubernetes.io/projected/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-kube-api-access-bbfl6\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.620073 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:35 crc kubenswrapper[4778]: I0318 09:54:35.103855 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wkp9p"] Mar 18 09:54:35 crc kubenswrapper[4778]: I0318 09:54:35.325995 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkp9p" event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerStarted","Data":"955fb0461de04ec66a7e60e5a7e268421dad9143ca3957a537dd0edcadbc1fc7"} Mar 18 09:54:36 crc kubenswrapper[4778]: I0318 09:54:36.339313 4778 generic.go:334] "Generic (PLEG): container finished" podID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerID="18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90" exitCode=0 Mar 18 09:54:36 crc kubenswrapper[4778]: I0318 09:54:36.339447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkp9p" event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerDied","Data":"18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90"} Mar 18 09:54:37 crc kubenswrapper[4778]: I0318 09:54:37.351191 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkp9p" event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerStarted","Data":"5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8"} Mar 18 09:54:38 crc kubenswrapper[4778]: I0318 09:54:38.364087 4778 generic.go:334] "Generic (PLEG): container finished" podID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerID="5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8" exitCode=0 Mar 18 09:54:38 crc kubenswrapper[4778]: I0318 09:54:38.364459 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkp9p" 
event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerDied","Data":"5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8"} Mar 18 09:54:39 crc kubenswrapper[4778]: I0318 09:54:39.379316 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkp9p" event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerStarted","Data":"92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72"} Mar 18 09:54:44 crc kubenswrapper[4778]: I0318 09:54:44.620243 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:44 crc kubenswrapper[4778]: I0318 09:54:44.620597 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:44 crc kubenswrapper[4778]: I0318 09:54:44.690430 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:44 crc kubenswrapper[4778]: I0318 09:54:44.717175 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wkp9p" podStartSLOduration=8.29297137 podStartE2EDuration="10.717152516s" podCreationTimestamp="2026-03-18 09:54:34 +0000 UTC" firstStartedPulling="2026-03-18 09:54:36.342566561 +0000 UTC m=+3142.917311411" lastFinishedPulling="2026-03-18 09:54:38.766747717 +0000 UTC m=+3145.341492557" observedRunningTime="2026-03-18 09:54:39.403921986 +0000 UTC m=+3145.978666856" watchObservedRunningTime="2026-03-18 09:54:44.717152516 +0000 UTC m=+3151.291897376" Mar 18 09:54:45 crc kubenswrapper[4778]: I0318 09:54:45.527288 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:45 crc kubenswrapper[4778]: I0318 09:54:45.580667 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-wkp9p"] Mar 18 09:54:47 crc kubenswrapper[4778]: I0318 09:54:47.187150 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:54:47 crc kubenswrapper[4778]: E0318 09:54:47.187633 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:54:47 crc kubenswrapper[4778]: I0318 09:54:47.485803 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wkp9p" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="registry-server" containerID="cri-o://92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72" gracePeriod=2 Mar 18 09:54:47 crc kubenswrapper[4778]: I0318 09:54:47.998751 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.189507 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-catalog-content\") pod \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.189598 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbfl6\" (UniqueName: \"kubernetes.io/projected/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-kube-api-access-bbfl6\") pod \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.189668 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-utilities\") pod \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.190999 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-utilities" (OuterVolumeSpecName: "utilities") pod "21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" (UID: "21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.202686 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-kube-api-access-bbfl6" (OuterVolumeSpecName: "kube-api-access-bbfl6") pod "21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" (UID: "21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e"). InnerVolumeSpecName "kube-api-access-bbfl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.240443 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" (UID: "21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.293376 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.293487 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbfl6\" (UniqueName: \"kubernetes.io/projected/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-kube-api-access-bbfl6\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.293509 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.497862 4778 generic.go:334] "Generic (PLEG): container finished" podID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerID="92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72" exitCode=0 Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.497906 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkp9p" event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerDied","Data":"92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72"} Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.497946 4778 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-wkp9p" event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerDied","Data":"955fb0461de04ec66a7e60e5a7e268421dad9143ca3957a537dd0edcadbc1fc7"} Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.497970 4778 scope.go:117] "RemoveContainer" containerID="92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.497990 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.544003 4778 scope.go:117] "RemoveContainer" containerID="5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.559038 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wkp9p"] Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.578096 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wkp9p"] Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.586470 4778 scope.go:117] "RemoveContainer" containerID="18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.616760 4778 scope.go:117] "RemoveContainer" containerID="92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72" Mar 18 09:54:48 crc kubenswrapper[4778]: E0318 09:54:48.617176 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72\": container with ID starting with 92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72 not found: ID does not exist" containerID="92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 
09:54:48.617233 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72"} err="failed to get container status \"92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72\": rpc error: code = NotFound desc = could not find container \"92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72\": container with ID starting with 92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72 not found: ID does not exist" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.617259 4778 scope.go:117] "RemoveContainer" containerID="5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8" Mar 18 09:54:48 crc kubenswrapper[4778]: E0318 09:54:48.617572 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8\": container with ID starting with 5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8 not found: ID does not exist" containerID="5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.617610 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8"} err="failed to get container status \"5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8\": rpc error: code = NotFound desc = could not find container \"5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8\": container with ID starting with 5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8 not found: ID does not exist" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.617630 4778 scope.go:117] "RemoveContainer" containerID="18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90" Mar 18 09:54:48 crc 
kubenswrapper[4778]: E0318 09:54:48.617919 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90\": container with ID starting with 18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90 not found: ID does not exist" containerID="18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.617962 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90"} err="failed to get container status \"18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90\": rpc error: code = NotFound desc = could not find container \"18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90\": container with ID starting with 18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90 not found: ID does not exist" Mar 18 09:54:50 crc kubenswrapper[4778]: I0318 09:54:50.201485 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" path="/var/lib/kubelet/pods/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e/volumes" Mar 18 09:54:59 crc kubenswrapper[4778]: I0318 09:54:59.187499 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:54:59 crc kubenswrapper[4778]: E0318 09:54:59.188332 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:55:11 crc 
kubenswrapper[4778]: I0318 09:55:11.187356 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:55:11 crc kubenswrapper[4778]: E0318 09:55:11.188293 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.195369 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:55:24 crc kubenswrapper[4778]: E0318 09:55:24.197060 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.341514 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Mar 18 09:55:24 crc kubenswrapper[4778]: E0318 09:55:24.342061 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="extract-content" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.342092 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="extract-content" Mar 18 09:55:24 crc kubenswrapper[4778]: E0318 09:55:24.342125 4778 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="extract-utilities" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.342137 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="extract-utilities" Mar 18 09:55:24 crc kubenswrapper[4778]: E0318 09:55:24.342189 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="registry-server" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.342224 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="registry-server" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.342549 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="registry-server" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.343524 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.346899 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-htxt6" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.347544 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.352556 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.352665 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.368513 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 
09:55:24.433402 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433478 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-config-data\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433603 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c89k6\" (UniqueName: \"kubernetes.io/projected/757e3758-d646-4267-8c4c-b5efb0dcf709-kube-api-access-c89k6\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433660 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433727 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: 
\"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433778 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433868 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433958 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.434025 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.434076 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ca-certs\") pod 
\"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536028 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536103 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536159 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536269 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536304 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-config-data\") pod 
\"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536382 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c89k6\" (UniqueName: \"kubernetes.io/projected/757e3758-d646-4267-8c4c-b5efb0dcf709-kube-api-access-c89k6\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536420 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536472 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536509 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536571 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: 
\"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536995 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.537269 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.537464 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.537737 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.537886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-config-data\") pod 
\"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.542815 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.548800 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.551436 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.552790 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.558863 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c89k6\" (UniqueName: \"kubernetes.io/projected/757e3758-d646-4267-8c4c-b5efb0dcf709-kube-api-access-c89k6\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " 
pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.570608 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.684112 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:25 crc kubenswrapper[4778]: I0318 09:55:25.221156 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Mar 18 09:55:25 crc kubenswrapper[4778]: I0318 09:55:25.863575 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"757e3758-d646-4267-8c4c-b5efb0dcf709","Type":"ContainerStarted","Data":"dcac98cd78d62b2f03dd429a022d38c29d36c13fc170b830ecbd627ba6023d27"} Mar 18 09:55:35 crc kubenswrapper[4778]: I0318 09:55:35.187907 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:55:35 crc kubenswrapper[4778]: E0318 09:55:35.188746 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:55:46 crc kubenswrapper[4778]: I0318 09:55:46.187015 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:55:46 crc kubenswrapper[4778]: E0318 09:55:46.187835 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:55:52 crc kubenswrapper[4778]: E0318 09:55:52.791482 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 18 09:55:52 crc kubenswrapper[4778]: E0318 09:55:52.792302 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest
/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c89k6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
tempest-tests-tempest-s00-full_openstack(757e3758-d646-4267-8c4c-b5efb0dcf709): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:55:52 crc kubenswrapper[4778]: E0318 09:55:52.793514 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest-s00-full" podUID="757e3758-d646-4267-8c4c-b5efb0dcf709" Mar 18 09:55:53 crc kubenswrapper[4778]: E0318 09:55:53.134362 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest-s00-full" podUID="757e3758-d646-4267-8c4c-b5efb0dcf709" Mar 18 09:55:59 crc kubenswrapper[4778]: I0318 09:55:59.186790 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:55:59 crc kubenswrapper[4778]: E0318 09:55:59.187831 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.160460 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563796-r7bfh"] Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.162152 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.164351 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.165359 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.166611 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.211384 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563796-r7bfh"] Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.246768 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v2p9\" (UniqueName: \"kubernetes.io/projected/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7-kube-api-access-2v2p9\") pod \"auto-csr-approver-29563796-r7bfh\" (UID: \"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7\") " pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.351012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v2p9\" (UniqueName: \"kubernetes.io/projected/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7-kube-api-access-2v2p9\") pod \"auto-csr-approver-29563796-r7bfh\" (UID: \"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7\") " pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.370879 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v2p9\" (UniqueName: \"kubernetes.io/projected/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7-kube-api-access-2v2p9\") pod \"auto-csr-approver-29563796-r7bfh\" (UID: \"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7\") " 
pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.511686 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:01 crc kubenswrapper[4778]: I0318 09:56:00.997638 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563796-r7bfh"] Mar 18 09:56:01 crc kubenswrapper[4778]: I0318 09:56:01.213326 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" event={"ID":"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7","Type":"ContainerStarted","Data":"6045c1a6752ab8c67ab2035fa4d5ea43d54f98f6650cf09fa319ce731eca27f7"} Mar 18 09:56:03 crc kubenswrapper[4778]: I0318 09:56:03.243269 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" event={"ID":"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7","Type":"ContainerStarted","Data":"e43c5819f1ce670a94fb282c0d10a53cccd0c70528dde0f82e60b4168a0b1dd9"} Mar 18 09:56:03 crc kubenswrapper[4778]: I0318 09:56:03.274273 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" podStartSLOduration=1.5641289760000001 podStartE2EDuration="3.274250342s" podCreationTimestamp="2026-03-18 09:56:00 +0000 UTC" firstStartedPulling="2026-03-18 09:56:00.988933458 +0000 UTC m=+3227.563678328" lastFinishedPulling="2026-03-18 09:56:02.699054854 +0000 UTC m=+3229.273799694" observedRunningTime="2026-03-18 09:56:03.265910816 +0000 UTC m=+3229.840655676" watchObservedRunningTime="2026-03-18 09:56:03.274250342 +0000 UTC m=+3229.848995192" Mar 18 09:56:04 crc kubenswrapper[4778]: I0318 09:56:04.255338 4778 generic.go:334] "Generic (PLEG): container finished" podID="952d8866-a2f9-46d0-aa7b-5e578dd2f3c7" containerID="e43c5819f1ce670a94fb282c0d10a53cccd0c70528dde0f82e60b4168a0b1dd9" exitCode=0 Mar 18 09:56:04 crc 
kubenswrapper[4778]: I0318 09:56:04.255386 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" event={"ID":"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7","Type":"ContainerDied","Data":"e43c5819f1ce670a94fb282c0d10a53cccd0c70528dde0f82e60b4168a0b1dd9"} Mar 18 09:56:05 crc kubenswrapper[4778]: I0318 09:56:05.700890 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:05 crc kubenswrapper[4778]: I0318 09:56:05.771917 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v2p9\" (UniqueName: \"kubernetes.io/projected/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7-kube-api-access-2v2p9\") pod \"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7\" (UID: \"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7\") " Mar 18 09:56:05 crc kubenswrapper[4778]: I0318 09:56:05.780891 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7-kube-api-access-2v2p9" (OuterVolumeSpecName: "kube-api-access-2v2p9") pod "952d8866-a2f9-46d0-aa7b-5e578dd2f3c7" (UID: "952d8866-a2f9-46d0-aa7b-5e578dd2f3c7"). InnerVolumeSpecName "kube-api-access-2v2p9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:56:05 crc kubenswrapper[4778]: I0318 09:56:05.875091 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v2p9\" (UniqueName: \"kubernetes.io/projected/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7-kube-api-access-2v2p9\") on node \"crc\" DevicePath \"\"" Mar 18 09:56:06 crc kubenswrapper[4778]: I0318 09:56:06.273388 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" event={"ID":"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7","Type":"ContainerDied","Data":"6045c1a6752ab8c67ab2035fa4d5ea43d54f98f6650cf09fa319ce731eca27f7"} Mar 18 09:56:06 crc kubenswrapper[4778]: I0318 09:56:06.273427 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6045c1a6752ab8c67ab2035fa4d5ea43d54f98f6650cf09fa319ce731eca27f7" Mar 18 09:56:06 crc kubenswrapper[4778]: I0318 09:56:06.273497 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:06 crc kubenswrapper[4778]: I0318 09:56:06.333060 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563790-nsf4n"] Mar 18 09:56:06 crc kubenswrapper[4778]: I0318 09:56:06.341349 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563790-nsf4n"] Mar 18 09:56:08 crc kubenswrapper[4778]: I0318 09:56:08.199049 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08fbf495-18e2-4d61-ad96-1bf74db07f0e" path="/var/lib/kubelet/pods/08fbf495-18e2-4d61-ad96-1bf74db07f0e/volumes" Mar 18 09:56:08 crc kubenswrapper[4778]: I0318 09:56:08.844641 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"757e3758-d646-4267-8c4c-b5efb0dcf709","Type":"ContainerStarted","Data":"c559ae3a1e4423e99c37d72f15f18f3cd16bc2838d62270df411dbac2afa6c1e"} Mar 18 
09:56:08 crc kubenswrapper[4778]: I0318 09:56:08.871833 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s00-full" podStartSLOduration=4.457845329 podStartE2EDuration="45.871815798s" podCreationTimestamp="2026-03-18 09:55:23 +0000 UTC" firstStartedPulling="2026-03-18 09:55:25.230622091 +0000 UTC m=+3191.805366931" lastFinishedPulling="2026-03-18 09:56:06.64459256 +0000 UTC m=+3233.219337400" observedRunningTime="2026-03-18 09:56:08.863320918 +0000 UTC m=+3235.438065788" watchObservedRunningTime="2026-03-18 09:56:08.871815798 +0000 UTC m=+3235.446560638" Mar 18 09:56:14 crc kubenswrapper[4778]: I0318 09:56:14.199220 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:56:14 crc kubenswrapper[4778]: E0318 09:56:14.199963 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:56:25 crc kubenswrapper[4778]: I0318 09:56:25.187764 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:56:25 crc kubenswrapper[4778]: E0318 09:56:25.188780 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:56:33 crc 
kubenswrapper[4778]: I0318 09:56:33.224682 4778 scope.go:117] "RemoveContainer" containerID="58ef47c1a33dc103d35c1381547dc4531f738d5df6648d3b82a9b2e034b9599e" Mar 18 09:56:40 crc kubenswrapper[4778]: I0318 09:56:40.187124 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:56:40 crc kubenswrapper[4778]: E0318 09:56:40.188492 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:56:53 crc kubenswrapper[4778]: I0318 09:56:53.187739 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:56:53 crc kubenswrapper[4778]: E0318 09:56:53.189694 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:57:04 crc kubenswrapper[4778]: I0318 09:57:04.196763 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:57:04 crc kubenswrapper[4778]: E0318 09:57:04.197640 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:57:19 crc kubenswrapper[4778]: I0318 09:57:19.188227 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:57:19 crc kubenswrapper[4778]: E0318 09:57:19.189549 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:57:30 crc kubenswrapper[4778]: I0318 09:57:30.187783 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:57:30 crc kubenswrapper[4778]: E0318 09:57:30.189331 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:57:44 crc kubenswrapper[4778]: I0318 09:57:44.192754 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:57:44 crc kubenswrapper[4778]: E0318 09:57:44.193539 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:57:58 crc kubenswrapper[4778]: I0318 09:57:58.187843 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:57:58 crc kubenswrapper[4778]: E0318 09:57:58.188804 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.178618 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563798-7x8l8"] Mar 18 09:58:00 crc kubenswrapper[4778]: E0318 09:58:00.179994 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="952d8866-a2f9-46d0-aa7b-5e578dd2f3c7" containerName="oc" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.180010 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="952d8866-a2f9-46d0-aa7b-5e578dd2f3c7" containerName="oc" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.180256 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="952d8866-a2f9-46d0-aa7b-5e578dd2f3c7" containerName="oc" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.181269 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.183879 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.183898 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.185473 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.205176 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563798-7x8l8"] Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.276806 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smjz9\" (UniqueName: \"kubernetes.io/projected/18a30920-760a-4dd3-ac4a-63b9add62521-kube-api-access-smjz9\") pod \"auto-csr-approver-29563798-7x8l8\" (UID: \"18a30920-760a-4dd3-ac4a-63b9add62521\") " pod="openshift-infra/auto-csr-approver-29563798-7x8l8" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.378538 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smjz9\" (UniqueName: \"kubernetes.io/projected/18a30920-760a-4dd3-ac4a-63b9add62521-kube-api-access-smjz9\") pod \"auto-csr-approver-29563798-7x8l8\" (UID: \"18a30920-760a-4dd3-ac4a-63b9add62521\") " pod="openshift-infra/auto-csr-approver-29563798-7x8l8" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.397405 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smjz9\" (UniqueName: \"kubernetes.io/projected/18a30920-760a-4dd3-ac4a-63b9add62521-kube-api-access-smjz9\") pod \"auto-csr-approver-29563798-7x8l8\" (UID: \"18a30920-760a-4dd3-ac4a-63b9add62521\") " 
pod="openshift-infra/auto-csr-approver-29563798-7x8l8" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.501305 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.959868 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563798-7x8l8"] Mar 18 09:58:01 crc kubenswrapper[4778]: I0318 09:58:01.507421 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" event={"ID":"18a30920-760a-4dd3-ac4a-63b9add62521","Type":"ContainerStarted","Data":"472e677611501d03d17155c0638f3802c0bde32da5313b520bbd9299971c8985"} Mar 18 09:58:02 crc kubenswrapper[4778]: I0318 09:58:02.517794 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" event={"ID":"18a30920-760a-4dd3-ac4a-63b9add62521","Type":"ContainerStarted","Data":"d0bc455b5828f2dc8d018076f6b57f0e3f54f0daa7e1a53584affe8f4dab5285"} Mar 18 09:58:02 crc kubenswrapper[4778]: I0318 09:58:02.542908 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" podStartSLOduration=1.582905706 podStartE2EDuration="2.542888639s" podCreationTimestamp="2026-03-18 09:58:00 +0000 UTC" firstStartedPulling="2026-03-18 09:58:00.963327344 +0000 UTC m=+3347.538072184" lastFinishedPulling="2026-03-18 09:58:01.923310277 +0000 UTC m=+3348.498055117" observedRunningTime="2026-03-18 09:58:02.529352412 +0000 UTC m=+3349.104097272" watchObservedRunningTime="2026-03-18 09:58:02.542888639 +0000 UTC m=+3349.117633479" Mar 18 09:58:03 crc kubenswrapper[4778]: I0318 09:58:03.528692 4778 generic.go:334] "Generic (PLEG): container finished" podID="18a30920-760a-4dd3-ac4a-63b9add62521" containerID="d0bc455b5828f2dc8d018076f6b57f0e3f54f0daa7e1a53584affe8f4dab5285" exitCode=0 Mar 18 09:58:03 crc 
kubenswrapper[4778]: I0318 09:58:03.528722 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" event={"ID":"18a30920-760a-4dd3-ac4a-63b9add62521","Type":"ContainerDied","Data":"d0bc455b5828f2dc8d018076f6b57f0e3f54f0daa7e1a53584affe8f4dab5285"} Mar 18 09:58:04 crc kubenswrapper[4778]: I0318 09:58:04.917962 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.081348 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smjz9\" (UniqueName: \"kubernetes.io/projected/18a30920-760a-4dd3-ac4a-63b9add62521-kube-api-access-smjz9\") pod \"18a30920-760a-4dd3-ac4a-63b9add62521\" (UID: \"18a30920-760a-4dd3-ac4a-63b9add62521\") " Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.086882 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a30920-760a-4dd3-ac4a-63b9add62521-kube-api-access-smjz9" (OuterVolumeSpecName: "kube-api-access-smjz9") pod "18a30920-760a-4dd3-ac4a-63b9add62521" (UID: "18a30920-760a-4dd3-ac4a-63b9add62521"). InnerVolumeSpecName "kube-api-access-smjz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.184077 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smjz9\" (UniqueName: \"kubernetes.io/projected/18a30920-760a-4dd3-ac4a-63b9add62521-kube-api-access-smjz9\") on node \"crc\" DevicePath \"\"" Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.548218 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" event={"ID":"18a30920-760a-4dd3-ac4a-63b9add62521","Type":"ContainerDied","Data":"472e677611501d03d17155c0638f3802c0bde32da5313b520bbd9299971c8985"} Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.548746 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="472e677611501d03d17155c0638f3802c0bde32da5313b520bbd9299971c8985" Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.548289 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.626306 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563792-4g4zq"] Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.636593 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563792-4g4zq"] Mar 18 09:58:06 crc kubenswrapper[4778]: I0318 09:58:06.199923 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7095f92-8336-4c69-9c71-c3b9aa45bb82" path="/var/lib/kubelet/pods/a7095f92-8336-4c69-9c71-c3b9aa45bb82/volumes" Mar 18 09:58:11 crc kubenswrapper[4778]: I0318 09:58:11.187483 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:58:11 crc kubenswrapper[4778]: I0318 09:58:11.755327 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"4a5832ad33a24fa897a57dad207c0db67df6b8bbaaa82af43426329f350e2e07"} Mar 18 09:58:33 crc kubenswrapper[4778]: I0318 09:58:33.369741 4778 scope.go:117] "RemoveContainer" containerID="129d18099eafc9ec58cca914d6f8f45f3f345a43be2618db7b6619ab09177632" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.582300 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h2w8w"] Mar 18 09:59:08 crc kubenswrapper[4778]: E0318 09:59:08.583704 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a30920-760a-4dd3-ac4a-63b9add62521" containerName="oc" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.583730 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a30920-760a-4dd3-ac4a-63b9add62521" containerName="oc" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.584122 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a30920-760a-4dd3-ac4a-63b9add62521" containerName="oc" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.586859 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.704378 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-catalog-content\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.704479 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mp47\" (UniqueName: \"kubernetes.io/projected/0cf93b82-72dd-4fae-976b-bec6edb2e920-kube-api-access-5mp47\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.704558 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-utilities\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.806374 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-catalog-content\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.806434 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mp47\" (UniqueName: \"kubernetes.io/projected/0cf93b82-72dd-4fae-976b-bec6edb2e920-kube-api-access-5mp47\") pod 
\"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.806464 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-utilities\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.807174 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-utilities\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.807321 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-catalog-content\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.827354 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mp47\" (UniqueName: \"kubernetes.io/projected/0cf93b82-72dd-4fae-976b-bec6edb2e920-kube-api-access-5mp47\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.913509 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h2w8w"] Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.927705 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:09 crc kubenswrapper[4778]: I0318 09:59:09.511865 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h2w8w"] Mar 18 09:59:10 crc kubenswrapper[4778]: I0318 09:59:10.334315 4778 generic.go:334] "Generic (PLEG): container finished" podID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerID="1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839" exitCode=0 Mar 18 09:59:10 crc kubenswrapper[4778]: I0318 09:59:10.334402 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerDied","Data":"1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839"} Mar 18 09:59:10 crc kubenswrapper[4778]: I0318 09:59:10.334608 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerStarted","Data":"77604ef018c62b9319353e20adb2f928cd62a0a17ae4608bb5ff128a91eb2c37"} Mar 18 09:59:10 crc kubenswrapper[4778]: I0318 09:59:10.338442 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:59:11 crc kubenswrapper[4778]: I0318 09:59:11.344210 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerStarted","Data":"b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe"} Mar 18 09:59:13 crc kubenswrapper[4778]: I0318 09:59:13.361648 4778 generic.go:334] "Generic (PLEG): container finished" podID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerID="b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe" exitCode=0 Mar 18 09:59:13 crc kubenswrapper[4778]: I0318 09:59:13.361952 4778 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerDied","Data":"b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe"} Mar 18 09:59:14 crc kubenswrapper[4778]: I0318 09:59:14.372041 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerStarted","Data":"26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363"} Mar 18 09:59:14 crc kubenswrapper[4778]: I0318 09:59:14.400479 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h2w8w" podStartSLOduration=2.7813531190000003 podStartE2EDuration="6.400459343s" podCreationTimestamp="2026-03-18 09:59:08 +0000 UTC" firstStartedPulling="2026-03-18 09:59:10.338179281 +0000 UTC m=+3416.912924111" lastFinishedPulling="2026-03-18 09:59:13.957285495 +0000 UTC m=+3420.532030335" observedRunningTime="2026-03-18 09:59:14.391371207 +0000 UTC m=+3420.966116047" watchObservedRunningTime="2026-03-18 09:59:14.400459343 +0000 UTC m=+3420.975204193" Mar 18 09:59:18 crc kubenswrapper[4778]: I0318 09:59:18.929127 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:18 crc kubenswrapper[4778]: I0318 09:59:18.929808 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:18 crc kubenswrapper[4778]: I0318 09:59:18.988797 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:19 crc kubenswrapper[4778]: I0318 09:59:19.458817 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:21 crc kubenswrapper[4778]: I0318 
09:59:21.316874 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h2w8w"] Mar 18 09:59:21 crc kubenswrapper[4778]: I0318 09:59:21.433158 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h2w8w" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="registry-server" containerID="cri-o://26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363" gracePeriod=2 Mar 18 09:59:21 crc kubenswrapper[4778]: I0318 09:59:21.944595 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.059494 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-utilities\") pod \"0cf93b82-72dd-4fae-976b-bec6edb2e920\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.059601 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mp47\" (UniqueName: \"kubernetes.io/projected/0cf93b82-72dd-4fae-976b-bec6edb2e920-kube-api-access-5mp47\") pod \"0cf93b82-72dd-4fae-976b-bec6edb2e920\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.059646 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-catalog-content\") pod \"0cf93b82-72dd-4fae-976b-bec6edb2e920\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.060555 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-utilities" (OuterVolumeSpecName: 
"utilities") pod "0cf93b82-72dd-4fae-976b-bec6edb2e920" (UID: "0cf93b82-72dd-4fae-976b-bec6edb2e920"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.066184 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf93b82-72dd-4fae-976b-bec6edb2e920-kube-api-access-5mp47" (OuterVolumeSpecName: "kube-api-access-5mp47") pod "0cf93b82-72dd-4fae-976b-bec6edb2e920" (UID: "0cf93b82-72dd-4fae-976b-bec6edb2e920"). InnerVolumeSpecName "kube-api-access-5mp47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.124835 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cf93b82-72dd-4fae-976b-bec6edb2e920" (UID: "0cf93b82-72dd-4fae-976b-bec6edb2e920"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.161682 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.161711 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mp47\" (UniqueName: \"kubernetes.io/projected/0cf93b82-72dd-4fae-976b-bec6edb2e920-kube-api-access-5mp47\") on node \"crc\" DevicePath \"\"" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.161723 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.447069 4778 generic.go:334] "Generic (PLEG): container finished" podID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerID="26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363" exitCode=0 Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.447123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerDied","Data":"26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363"} Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.447163 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h2w8w" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.447217 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerDied","Data":"77604ef018c62b9319353e20adb2f928cd62a0a17ae4608bb5ff128a91eb2c37"} Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.447249 4778 scope.go:117] "RemoveContainer" containerID="26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.485715 4778 scope.go:117] "RemoveContainer" containerID="b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.500910 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h2w8w"] Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.510580 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h2w8w"] Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.512928 4778 scope.go:117] "RemoveContainer" containerID="1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.574177 4778 scope.go:117] "RemoveContainer" containerID="26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363" Mar 18 09:59:22 crc kubenswrapper[4778]: E0318 09:59:22.574761 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363\": container with ID starting with 26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363 not found: ID does not exist" containerID="26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.574823 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363"} err="failed to get container status \"26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363\": rpc error: code = NotFound desc = could not find container \"26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363\": container with ID starting with 26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363 not found: ID does not exist" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.574857 4778 scope.go:117] "RemoveContainer" containerID="b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe" Mar 18 09:59:22 crc kubenswrapper[4778]: E0318 09:59:22.575402 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe\": container with ID starting with b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe not found: ID does not exist" containerID="b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.575444 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe"} err="failed to get container status \"b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe\": rpc error: code = NotFound desc = could not find container \"b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe\": container with ID starting with b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe not found: ID does not exist" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.575474 4778 scope.go:117] "RemoveContainer" containerID="1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839" Mar 18 09:59:22 crc kubenswrapper[4778]: E0318 
09:59:22.575776 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839\": container with ID starting with 1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839 not found: ID does not exist" containerID="1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839" Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.575805 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839"} err="failed to get container status \"1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839\": rpc error: code = NotFound desc = could not find container \"1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839\": container with ID starting with 1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839 not found: ID does not exist" Mar 18 09:59:24 crc kubenswrapper[4778]: I0318 09:59:24.201938 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" path="/var/lib/kubelet/pods/0cf93b82-72dd-4fae-976b-bec6edb2e920/volumes" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.149279 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563800-8grkw"] Mar 18 10:00:00 crc kubenswrapper[4778]: E0318 10:00:00.150426 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="registry-server" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.150446 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="registry-server" Mar 18 10:00:00 crc kubenswrapper[4778]: E0318 10:00:00.150464 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="extract-content" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.150471 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="extract-content" Mar 18 10:00:00 crc kubenswrapper[4778]: E0318 10:00:00.150495 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="extract-utilities" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.150503 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="extract-utilities" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.150744 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="registry-server" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.151585 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563800-8grkw" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.155456 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.155719 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.156733 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.159507 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"] Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.161156 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.163146 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.165910 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.182248 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563800-8grkw"] Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.247401 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"] Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.254154 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca9f1133-0fec-4eeb-8b9b-39148a035a92-config-volume\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.254356 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4hks\" (UniqueName: \"kubernetes.io/projected/ca9f1133-0fec-4eeb-8b9b-39148a035a92-kube-api-access-m4hks\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.254431 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbp4c\" (UniqueName: 
\"kubernetes.io/projected/b7196caa-da0c-4933-b2d0-81c472bed9a9-kube-api-access-hbp4c\") pod \"auto-csr-approver-29563800-8grkw\" (UID: \"b7196caa-da0c-4933-b2d0-81c472bed9a9\") " pod="openshift-infra/auto-csr-approver-29563800-8grkw" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.254472 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca9f1133-0fec-4eeb-8b9b-39148a035a92-secret-volume\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.357075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca9f1133-0fec-4eeb-8b9b-39148a035a92-config-volume\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.357301 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4hks\" (UniqueName: \"kubernetes.io/projected/ca9f1133-0fec-4eeb-8b9b-39148a035a92-kube-api-access-m4hks\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.358064 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca9f1133-0fec-4eeb-8b9b-39148a035a92-config-volume\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 
10:00:00.358252 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbp4c\" (UniqueName: \"kubernetes.io/projected/b7196caa-da0c-4933-b2d0-81c472bed9a9-kube-api-access-hbp4c\") pod \"auto-csr-approver-29563800-8grkw\" (UID: \"b7196caa-da0c-4933-b2d0-81c472bed9a9\") " pod="openshift-infra/auto-csr-approver-29563800-8grkw" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.358336 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca9f1133-0fec-4eeb-8b9b-39148a035a92-secret-volume\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.372935 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca9f1133-0fec-4eeb-8b9b-39148a035a92-secret-volume\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.378967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbp4c\" (UniqueName: \"kubernetes.io/projected/b7196caa-da0c-4933-b2d0-81c472bed9a9-kube-api-access-hbp4c\") pod \"auto-csr-approver-29563800-8grkw\" (UID: \"b7196caa-da0c-4933-b2d0-81c472bed9a9\") " pod="openshift-infra/auto-csr-approver-29563800-8grkw" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.382683 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4hks\" (UniqueName: \"kubernetes.io/projected/ca9f1133-0fec-4eeb-8b9b-39148a035a92-kube-api-access-m4hks\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.498603 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.503884 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563800-8grkw" Mar 18 10:00:01 crc kubenswrapper[4778]: I0318 10:00:01.205993 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"] Mar 18 10:00:01 crc kubenswrapper[4778]: I0318 10:00:01.216471 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563800-8grkw"] Mar 18 10:00:01 crc kubenswrapper[4778]: I0318 10:00:01.875877 4778 generic.go:334] "Generic (PLEG): container finished" podID="ca9f1133-0fec-4eeb-8b9b-39148a035a92" containerID="20eba30be4d8526eb64b11cc9e3c58803630e3554035c19c9650d8cecb2ebf82" exitCode=0 Mar 18 10:00:01 crc kubenswrapper[4778]: I0318 10:00:01.876033 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" event={"ID":"ca9f1133-0fec-4eeb-8b9b-39148a035a92","Type":"ContainerDied","Data":"20eba30be4d8526eb64b11cc9e3c58803630e3554035c19c9650d8cecb2ebf82"} Mar 18 10:00:01 crc kubenswrapper[4778]: I0318 10:00:01.876179 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" event={"ID":"ca9f1133-0fec-4eeb-8b9b-39148a035a92","Type":"ContainerStarted","Data":"14a0de78c81ed2b08ab92fa03668d411cbfd11c81944dba852d6d77cb068f7d4"} Mar 18 10:00:01 crc kubenswrapper[4778]: I0318 10:00:01.877887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563800-8grkw" 
event={"ID":"b7196caa-da0c-4933-b2d0-81c472bed9a9","Type":"ContainerStarted","Data":"3294134c32825ebc6339fed6c44cda6f1e69b2d7521a0e5af1bdc7fae18305a0"} Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.266306 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.433399 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4hks\" (UniqueName: \"kubernetes.io/projected/ca9f1133-0fec-4eeb-8b9b-39148a035a92-kube-api-access-m4hks\") pod \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.433618 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca9f1133-0fec-4eeb-8b9b-39148a035a92-secret-volume\") pod \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.433760 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca9f1133-0fec-4eeb-8b9b-39148a035a92-config-volume\") pod \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.434753 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9f1133-0fec-4eeb-8b9b-39148a035a92-config-volume" (OuterVolumeSpecName: "config-volume") pod "ca9f1133-0fec-4eeb-8b9b-39148a035a92" (UID: "ca9f1133-0fec-4eeb-8b9b-39148a035a92"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.439630 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9f1133-0fec-4eeb-8b9b-39148a035a92-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ca9f1133-0fec-4eeb-8b9b-39148a035a92" (UID: "ca9f1133-0fec-4eeb-8b9b-39148a035a92"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.441011 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9f1133-0fec-4eeb-8b9b-39148a035a92-kube-api-access-m4hks" (OuterVolumeSpecName: "kube-api-access-m4hks") pod "ca9f1133-0fec-4eeb-8b9b-39148a035a92" (UID: "ca9f1133-0fec-4eeb-8b9b-39148a035a92"). InnerVolumeSpecName "kube-api-access-m4hks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.536576 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca9f1133-0fec-4eeb-8b9b-39148a035a92-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.536624 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca9f1133-0fec-4eeb-8b9b-39148a035a92-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.536634 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4hks\" (UniqueName: \"kubernetes.io/projected/ca9f1133-0fec-4eeb-8b9b-39148a035a92-kube-api-access-m4hks\") on node \"crc\" DevicePath \"\"" Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.893321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" 
event={"ID":"ca9f1133-0fec-4eeb-8b9b-39148a035a92","Type":"ContainerDied","Data":"14a0de78c81ed2b08ab92fa03668d411cbfd11c81944dba852d6d77cb068f7d4"} Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.893353 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14a0de78c81ed2b08ab92fa03668d411cbfd11c81944dba852d6d77cb068f7d4" Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.893372 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" Mar 18 10:00:04 crc kubenswrapper[4778]: I0318 10:00:04.357576 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6"] Mar 18 10:00:04 crc kubenswrapper[4778]: I0318 10:00:04.373100 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6"] Mar 18 10:00:06 crc kubenswrapper[4778]: I0318 10:00:06.201379 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea72845-4b27-4381-b08b-e0570c67bddb" path="/var/lib/kubelet/pods/bea72845-4b27-4381-b08b-e0570c67bddb/volumes" Mar 18 10:00:21 crc kubenswrapper[4778]: I0318 10:00:21.045110 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563800-8grkw" event={"ID":"b7196caa-da0c-4933-b2d0-81c472bed9a9","Type":"ContainerStarted","Data":"52b2bf061001e2a8dfe4b355dbc94585c1d208b337020ae42eb7ee2f487a7b0c"} Mar 18 10:00:21 crc kubenswrapper[4778]: I0318 10:00:21.070532 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563800-8grkw" podStartSLOduration=1.716889178 podStartE2EDuration="21.070508635s" podCreationTimestamp="2026-03-18 10:00:00 +0000 UTC" firstStartedPulling="2026-03-18 10:00:01.211945514 +0000 UTC m=+3467.786690354" lastFinishedPulling="2026-03-18 10:00:20.565564971 +0000 
UTC m=+3487.140309811" observedRunningTime="2026-03-18 10:00:21.058443137 +0000 UTC m=+3487.633188007" watchObservedRunningTime="2026-03-18 10:00:21.070508635 +0000 UTC m=+3487.645253475" Mar 18 10:00:22 crc kubenswrapper[4778]: I0318 10:00:22.055725 4778 generic.go:334] "Generic (PLEG): container finished" podID="b7196caa-da0c-4933-b2d0-81c472bed9a9" containerID="52b2bf061001e2a8dfe4b355dbc94585c1d208b337020ae42eb7ee2f487a7b0c" exitCode=0 Mar 18 10:00:22 crc kubenswrapper[4778]: I0318 10:00:22.056217 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563800-8grkw" event={"ID":"b7196caa-da0c-4933-b2d0-81c472bed9a9","Type":"ContainerDied","Data":"52b2bf061001e2a8dfe4b355dbc94585c1d208b337020ae42eb7ee2f487a7b0c"} Mar 18 10:00:23 crc kubenswrapper[4778]: I0318 10:00:23.471145 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563800-8grkw" Mar 18 10:00:23 crc kubenswrapper[4778]: I0318 10:00:23.534056 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbp4c\" (UniqueName: \"kubernetes.io/projected/b7196caa-da0c-4933-b2d0-81c472bed9a9-kube-api-access-hbp4c\") pod \"b7196caa-da0c-4933-b2d0-81c472bed9a9\" (UID: \"b7196caa-da0c-4933-b2d0-81c472bed9a9\") " Mar 18 10:00:23 crc kubenswrapper[4778]: I0318 10:00:23.541618 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7196caa-da0c-4933-b2d0-81c472bed9a9-kube-api-access-hbp4c" (OuterVolumeSpecName: "kube-api-access-hbp4c") pod "b7196caa-da0c-4933-b2d0-81c472bed9a9" (UID: "b7196caa-da0c-4933-b2d0-81c472bed9a9"). InnerVolumeSpecName "kube-api-access-hbp4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:00:23 crc kubenswrapper[4778]: I0318 10:00:23.635774 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbp4c\" (UniqueName: \"kubernetes.io/projected/b7196caa-da0c-4933-b2d0-81c472bed9a9-kube-api-access-hbp4c\") on node \"crc\" DevicePath \"\"" Mar 18 10:00:24 crc kubenswrapper[4778]: I0318 10:00:24.085394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563800-8grkw" event={"ID":"b7196caa-da0c-4933-b2d0-81c472bed9a9","Type":"ContainerDied","Data":"3294134c32825ebc6339fed6c44cda6f1e69b2d7521a0e5af1bdc7fae18305a0"} Mar 18 10:00:24 crc kubenswrapper[4778]: I0318 10:00:24.085773 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3294134c32825ebc6339fed6c44cda6f1e69b2d7521a0e5af1bdc7fae18305a0" Mar 18 10:00:24 crc kubenswrapper[4778]: I0318 10:00:24.085481 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563800-8grkw" Mar 18 10:00:24 crc kubenswrapper[4778]: I0318 10:00:24.166321 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563794-586xm"] Mar 18 10:00:24 crc kubenswrapper[4778]: I0318 10:00:24.173999 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563794-586xm"] Mar 18 10:00:24 crc kubenswrapper[4778]: I0318 10:00:24.199799 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c681ab9e-5bfe-4e10-9154-41ff0c5d76a3" path="/var/lib/kubelet/pods/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3/volumes" Mar 18 10:00:30 crc kubenswrapper[4778]: I0318 10:00:30.147764 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 10:00:30 crc kubenswrapper[4778]: I0318 10:00:30.148347 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:00:33 crc kubenswrapper[4778]: I0318 10:00:33.498236 4778 scope.go:117] "RemoveContainer" containerID="1887b38177a6dc3b69f09e9dc6a6dd26a61cf63c5f532cbb5b0e04e1fb5a3d8b" Mar 18 10:00:33 crc kubenswrapper[4778]: I0318 10:00:33.562890 4778 scope.go:117] "RemoveContainer" containerID="eaf004108fc735124a6750b445bf1e3f7676efb1a3da3a71036d9a0909c64710" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.147539 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.148080 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.148096 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29563801-nctwn"] Mar 18 10:01:00 crc kubenswrapper[4778]: E0318 10:01:00.150577 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9f1133-0fec-4eeb-8b9b-39148a035a92" containerName="collect-profiles" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.150607 4778 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ca9f1133-0fec-4eeb-8b9b-39148a035a92" containerName="collect-profiles" Mar 18 10:01:00 crc kubenswrapper[4778]: E0318 10:01:00.150639 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7196caa-da0c-4933-b2d0-81c472bed9a9" containerName="oc" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.150649 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7196caa-da0c-4933-b2d0-81c472bed9a9" containerName="oc" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.150956 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7196caa-da0c-4933-b2d0-81c472bed9a9" containerName="oc" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.150983 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca9f1133-0fec-4eeb-8b9b-39148a035a92" containerName="collect-profiles" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.151802 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.181788 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563801-nctwn"] Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.226794 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-fernet-keys\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.227167 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-combined-ca-bundle\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " 
pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.227349 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsszw\" (UniqueName: \"kubernetes.io/projected/8ace9f11-f4d8-4801-afa2-5b723d52d41e-kube-api-access-bsszw\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.227594 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.330101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-combined-ca-bundle\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.330222 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsszw\" (UniqueName: \"kubernetes.io/projected/8ace9f11-f4d8-4801-afa2-5b723d52d41e-kube-api-access-bsszw\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.330319 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " 
pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.330408 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-fernet-keys\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.336543 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-combined-ca-bundle\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.336733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-fernet-keys\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.339055 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.347866 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsszw\" (UniqueName: \"kubernetes.io/projected/8ace9f11-f4d8-4801-afa2-5b723d52d41e-kube-api-access-bsszw\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 
10:01:00.475236 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.930256 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563801-nctwn"] Mar 18 10:01:01 crc kubenswrapper[4778]: I0318 10:01:01.414894 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563801-nctwn" event={"ID":"8ace9f11-f4d8-4801-afa2-5b723d52d41e","Type":"ContainerStarted","Data":"28d152bac9f17e0efab4925b14ece7afdb8366b297f19851fea386af5ff7041d"} Mar 18 10:01:01 crc kubenswrapper[4778]: I0318 10:01:01.415277 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563801-nctwn" event={"ID":"8ace9f11-f4d8-4801-afa2-5b723d52d41e","Type":"ContainerStarted","Data":"6861f7ab8ee5902e26632016dd16afe79b26f8e149799abe01220689f2fa3927"} Mar 18 10:01:01 crc kubenswrapper[4778]: I0318 10:01:01.433732 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29563801-nctwn" podStartSLOduration=1.433713488 podStartE2EDuration="1.433713488s" podCreationTimestamp="2026-03-18 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:01:01.429804193 +0000 UTC m=+3528.004549053" watchObservedRunningTime="2026-03-18 10:01:01.433713488 +0000 UTC m=+3528.008458328" Mar 18 10:01:04 crc kubenswrapper[4778]: I0318 10:01:04.440854 4778 generic.go:334] "Generic (PLEG): container finished" podID="8ace9f11-f4d8-4801-afa2-5b723d52d41e" containerID="28d152bac9f17e0efab4925b14ece7afdb8366b297f19851fea386af5ff7041d" exitCode=0 Mar 18 10:01:04 crc kubenswrapper[4778]: I0318 10:01:04.440926 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563801-nctwn" 
event={"ID":"8ace9f11-f4d8-4801-afa2-5b723d52d41e","Type":"ContainerDied","Data":"28d152bac9f17e0efab4925b14ece7afdb8366b297f19851fea386af5ff7041d"} Mar 18 10:01:05 crc kubenswrapper[4778]: I0318 10:01:05.933806 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.244354 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-fernet-keys\") pod \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.244471 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data\") pod \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.244560 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsszw\" (UniqueName: \"kubernetes.io/projected/8ace9f11-f4d8-4801-afa2-5b723d52d41e-kube-api-access-bsszw\") pod \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.244733 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-combined-ca-bundle\") pod \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.253654 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "8ace9f11-f4d8-4801-afa2-5b723d52d41e" (UID: "8ace9f11-f4d8-4801-afa2-5b723d52d41e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.258548 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ace9f11-f4d8-4801-afa2-5b723d52d41e-kube-api-access-bsszw" (OuterVolumeSpecName: "kube-api-access-bsszw") pod "8ace9f11-f4d8-4801-afa2-5b723d52d41e" (UID: "8ace9f11-f4d8-4801-afa2-5b723d52d41e"). InnerVolumeSpecName "kube-api-access-bsszw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.330609 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ace9f11-f4d8-4801-afa2-5b723d52d41e" (UID: "8ace9f11-f4d8-4801-afa2-5b723d52d41e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.346499 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data" (OuterVolumeSpecName: "config-data") pod "8ace9f11-f4d8-4801-afa2-5b723d52d41e" (UID: "8ace9f11-f4d8-4801-afa2-5b723d52d41e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.346782 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data\") pod \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " Mar 18 10:01:06 crc kubenswrapper[4778]: W0318 10:01:06.346915 4778 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8ace9f11-f4d8-4801-afa2-5b723d52d41e/volumes/kubernetes.io~secret/config-data Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.346937 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data" (OuterVolumeSpecName: "config-data") pod "8ace9f11-f4d8-4801-afa2-5b723d52d41e" (UID: "8ace9f11-f4d8-4801-afa2-5b723d52d41e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.347586 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.347619 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.347633 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.347646 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsszw\" (UniqueName: \"kubernetes.io/projected/8ace9f11-f4d8-4801-afa2-5b723d52d41e-kube-api-access-bsszw\") on node \"crc\" DevicePath \"\"" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.456339 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563801-nctwn" event={"ID":"8ace9f11-f4d8-4801-afa2-5b723d52d41e","Type":"ContainerDied","Data":"6861f7ab8ee5902e26632016dd16afe79b26f8e149799abe01220689f2fa3927"} Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.456382 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6861f7ab8ee5902e26632016dd16afe79b26f8e149799abe01220689f2fa3927" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.456392 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:06 crc kubenswrapper[4778]: E0318 10:01:06.640659 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ace9f11_f4d8_4801_afa2_5b723d52d41e.slice/crio-6861f7ab8ee5902e26632016dd16afe79b26f8e149799abe01220689f2fa3927\": RecentStats: unable to find data in memory cache]" Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.147253 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.147892 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.147943 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.149075 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a5832ad33a24fa897a57dad207c0db67df6b8bbaaa82af43426329f350e2e07"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.149134 4778 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://4a5832ad33a24fa897a57dad207c0db67df6b8bbaaa82af43426329f350e2e07" gracePeriod=600 Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.678144 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="4a5832ad33a24fa897a57dad207c0db67df6b8bbaaa82af43426329f350e2e07" exitCode=0 Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.678186 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"4a5832ad33a24fa897a57dad207c0db67df6b8bbaaa82af43426329f350e2e07"} Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.678915 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"} Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.678959 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.151890 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563802-2blvs"] Mar 18 10:02:00 crc kubenswrapper[4778]: E0318 10:02:00.154096 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ace9f11-f4d8-4801-afa2-5b723d52d41e" containerName="keystone-cron" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.154241 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ace9f11-f4d8-4801-afa2-5b723d52d41e" containerName="keystone-cron" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.154655 
4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ace9f11-f4d8-4801-afa2-5b723d52d41e" containerName="keystone-cron" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.155649 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.158991 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.159020 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.161613 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563802-2blvs"] Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.164505 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.190967 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnb56\" (UniqueName: \"kubernetes.io/projected/02a9f934-8e78-4c0c-b0cc-59cd49030b5c-kube-api-access-vnb56\") pod \"auto-csr-approver-29563802-2blvs\" (UID: \"02a9f934-8e78-4c0c-b0cc-59cd49030b5c\") " pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.293157 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnb56\" (UniqueName: \"kubernetes.io/projected/02a9f934-8e78-4c0c-b0cc-59cd49030b5c-kube-api-access-vnb56\") pod \"auto-csr-approver-29563802-2blvs\" (UID: \"02a9f934-8e78-4c0c-b0cc-59cd49030b5c\") " pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.320113 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vnb56\" (UniqueName: \"kubernetes.io/projected/02a9f934-8e78-4c0c-b0cc-59cd49030b5c-kube-api-access-vnb56\") pod \"auto-csr-approver-29563802-2blvs\" (UID: \"02a9f934-8e78-4c0c-b0cc-59cd49030b5c\") " pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.496066 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.985681 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563802-2blvs"] Mar 18 10:02:01 crc kubenswrapper[4778]: I0318 10:02:01.965489 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563802-2blvs" event={"ID":"02a9f934-8e78-4c0c-b0cc-59cd49030b5c","Type":"ContainerStarted","Data":"475b2926bb7d3ae12ec8845273a247fb8f09834a4a363bd68973f3e3039f20d1"} Mar 18 10:02:02 crc kubenswrapper[4778]: I0318 10:02:02.977262 4778 generic.go:334] "Generic (PLEG): container finished" podID="02a9f934-8e78-4c0c-b0cc-59cd49030b5c" containerID="048eac64aca1190d343bcd6e5968c051bfb3de1baa3c94a83313a7e1b9b996de" exitCode=0 Mar 18 10:02:02 crc kubenswrapper[4778]: I0318 10:02:02.977345 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563802-2blvs" event={"ID":"02a9f934-8e78-4c0c-b0cc-59cd49030b5c","Type":"ContainerDied","Data":"048eac64aca1190d343bcd6e5968c051bfb3de1baa3c94a83313a7e1b9b996de"} Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.527624 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.676691 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnb56\" (UniqueName: \"kubernetes.io/projected/02a9f934-8e78-4c0c-b0cc-59cd49030b5c-kube-api-access-vnb56\") pod \"02a9f934-8e78-4c0c-b0cc-59cd49030b5c\" (UID: \"02a9f934-8e78-4c0c-b0cc-59cd49030b5c\") " Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.695444 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a9f934-8e78-4c0c-b0cc-59cd49030b5c-kube-api-access-vnb56" (OuterVolumeSpecName: "kube-api-access-vnb56") pod "02a9f934-8e78-4c0c-b0cc-59cd49030b5c" (UID: "02a9f934-8e78-4c0c-b0cc-59cd49030b5c"). InnerVolumeSpecName "kube-api-access-vnb56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.779470 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnb56\" (UniqueName: \"kubernetes.io/projected/02a9f934-8e78-4c0c-b0cc-59cd49030b5c-kube-api-access-vnb56\") on node \"crc\" DevicePath \"\"" Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.996492 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563802-2blvs" event={"ID":"02a9f934-8e78-4c0c-b0cc-59cd49030b5c","Type":"ContainerDied","Data":"475b2926bb7d3ae12ec8845273a247fb8f09834a4a363bd68973f3e3039f20d1"} Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.996548 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="475b2926bb7d3ae12ec8845273a247fb8f09834a4a363bd68973f3e3039f20d1" Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.996624 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:05 crc kubenswrapper[4778]: I0318 10:02:05.601461 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563796-r7bfh"] Mar 18 10:02:05 crc kubenswrapper[4778]: I0318 10:02:05.610433 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563796-r7bfh"] Mar 18 10:02:06 crc kubenswrapper[4778]: I0318 10:02:06.198424 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="952d8866-a2f9-46d0-aa7b-5e578dd2f3c7" path="/var/lib/kubelet/pods/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7/volumes" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.858822 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qb8tg"] Mar 18 10:02:30 crc kubenswrapper[4778]: E0318 10:02:30.868687 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a9f934-8e78-4c0c-b0cc-59cd49030b5c" containerName="oc" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.868946 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a9f934-8e78-4c0c-b0cc-59cd49030b5c" containerName="oc" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.869757 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a9f934-8e78-4c0c-b0cc-59cd49030b5c" containerName="oc" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.872912 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.902920 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb8tg"] Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.952066 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-catalog-content\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.952477 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-utilities\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.952617 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsvsd\" (UniqueName: \"kubernetes.io/projected/0c2e606c-94aa-4c97-aef4-741fc7402bac-kube-api-access-qsvsd\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.055072 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-utilities\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.055147 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qsvsd\" (UniqueName: \"kubernetes.io/projected/0c2e606c-94aa-4c97-aef4-741fc7402bac-kube-api-access-qsvsd\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.055288 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-catalog-content\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.056224 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-utilities\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.056334 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-catalog-content\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.075509 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsvsd\" (UniqueName: \"kubernetes.io/projected/0c2e606c-94aa-4c97-aef4-741fc7402bac-kube-api-access-qsvsd\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.210177 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.728224 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb8tg"] Mar 18 10:02:32 crc kubenswrapper[4778]: I0318 10:02:32.244690 4778 generic.go:334] "Generic (PLEG): container finished" podID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerID="c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13" exitCode=0 Mar 18 10:02:32 crc kubenswrapper[4778]: I0318 10:02:32.245057 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerDied","Data":"c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13"} Mar 18 10:02:32 crc kubenswrapper[4778]: I0318 10:02:32.245136 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerStarted","Data":"a55e38478b117edd30ba7fd52a6bbb758ea162a56775c1ebaade4a1d8fc531b7"} Mar 18 10:02:33 crc kubenswrapper[4778]: I0318 10:02:33.255141 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerStarted","Data":"611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51"} Mar 18 10:02:33 crc kubenswrapper[4778]: I0318 10:02:33.655585 4778 scope.go:117] "RemoveContainer" containerID="e43c5819f1ce670a94fb282c0d10a53cccd0c70528dde0f82e60b4168a0b1dd9" Mar 18 10:02:34 crc kubenswrapper[4778]: I0318 10:02:34.266095 4778 generic.go:334] "Generic (PLEG): container finished" podID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerID="611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51" exitCode=0 Mar 18 10:02:34 crc kubenswrapper[4778]: I0318 10:02:34.266268 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerDied","Data":"611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51"} Mar 18 10:02:35 crc kubenswrapper[4778]: I0318 10:02:35.278228 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerStarted","Data":"db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020"} Mar 18 10:02:35 crc kubenswrapper[4778]: I0318 10:02:35.308895 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qb8tg" podStartSLOduration=2.871759668 podStartE2EDuration="5.308867688s" podCreationTimestamp="2026-03-18 10:02:30 +0000 UTC" firstStartedPulling="2026-03-18 10:02:32.247843609 +0000 UTC m=+3618.822588449" lastFinishedPulling="2026-03-18 10:02:34.684951629 +0000 UTC m=+3621.259696469" observedRunningTime="2026-03-18 10:02:35.294473717 +0000 UTC m=+3621.869218557" watchObservedRunningTime="2026-03-18 10:02:35.308867688 +0000 UTC m=+3621.883612528" Mar 18 10:02:41 crc kubenswrapper[4778]: I0318 10:02:41.211256 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:41 crc kubenswrapper[4778]: I0318 10:02:41.212096 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:41 crc kubenswrapper[4778]: I0318 10:02:41.256965 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:41 crc kubenswrapper[4778]: I0318 10:02:41.377005 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:41 crc kubenswrapper[4778]: I0318 
10:02:41.498433 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb8tg"] Mar 18 10:02:43 crc kubenswrapper[4778]: I0318 10:02:43.347585 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qb8tg" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="registry-server" containerID="cri-o://db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020" gracePeriod=2 Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.036600 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.128937 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-catalog-content\") pod \"0c2e606c-94aa-4c97-aef4-741fc7402bac\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.129045 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-utilities\") pod \"0c2e606c-94aa-4c97-aef4-741fc7402bac\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.129181 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsvsd\" (UniqueName: \"kubernetes.io/projected/0c2e606c-94aa-4c97-aef4-741fc7402bac-kube-api-access-qsvsd\") pod \"0c2e606c-94aa-4c97-aef4-741fc7402bac\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.130736 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-utilities" (OuterVolumeSpecName: 
"utilities") pod "0c2e606c-94aa-4c97-aef4-741fc7402bac" (UID: "0c2e606c-94aa-4c97-aef4-741fc7402bac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.136743 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2e606c-94aa-4c97-aef4-741fc7402bac-kube-api-access-qsvsd" (OuterVolumeSpecName: "kube-api-access-qsvsd") pod "0c2e606c-94aa-4c97-aef4-741fc7402bac" (UID: "0c2e606c-94aa-4c97-aef4-741fc7402bac"). InnerVolumeSpecName "kube-api-access-qsvsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.155884 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c2e606c-94aa-4c97-aef4-741fc7402bac" (UID: "0c2e606c-94aa-4c97-aef4-741fc7402bac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.234163 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsvsd\" (UniqueName: \"kubernetes.io/projected/0c2e606c-94aa-4c97-aef4-741fc7402bac-kube-api-access-qsvsd\") on node \"crc\" DevicePath \"\"" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.234527 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.234544 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.358279 4778 generic.go:334] "Generic (PLEG): container finished" podID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerID="db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020" exitCode=0 Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.358322 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerDied","Data":"db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020"} Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.358352 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerDied","Data":"a55e38478b117edd30ba7fd52a6bbb758ea162a56775c1ebaade4a1d8fc531b7"} Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.358377 4778 scope.go:117] "RemoveContainer" containerID="db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 
10:02:44.358371 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.383506 4778 scope.go:117] "RemoveContainer" containerID="611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.386363 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb8tg"] Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.399804 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb8tg"] Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.407040 4778 scope.go:117] "RemoveContainer" containerID="c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.458901 4778 scope.go:117] "RemoveContainer" containerID="db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020" Mar 18 10:02:44 crc kubenswrapper[4778]: E0318 10:02:44.459513 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020\": container with ID starting with db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020 not found: ID does not exist" containerID="db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.459566 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020"} err="failed to get container status \"db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020\": rpc error: code = NotFound desc = could not find container \"db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020\": container with ID starting with 
db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020 not found: ID does not exist" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.459600 4778 scope.go:117] "RemoveContainer" containerID="611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51" Mar 18 10:02:44 crc kubenswrapper[4778]: E0318 10:02:44.460004 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51\": container with ID starting with 611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51 not found: ID does not exist" containerID="611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.460045 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51"} err="failed to get container status \"611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51\": rpc error: code = NotFound desc = could not find container \"611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51\": container with ID starting with 611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51 not found: ID does not exist" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.460073 4778 scope.go:117] "RemoveContainer" containerID="c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13" Mar 18 10:02:44 crc kubenswrapper[4778]: E0318 10:02:44.460432 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13\": container with ID starting with c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13 not found: ID does not exist" containerID="c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13" Mar 18 10:02:44 crc 
kubenswrapper[4778]: I0318 10:02:44.460472 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13"} err="failed to get container status \"c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13\": rpc error: code = NotFound desc = could not find container \"c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13\": container with ID starting with c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13 not found: ID does not exist" Mar 18 10:02:46 crc kubenswrapper[4778]: I0318 10:02:46.199262 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" path="/var/lib/kubelet/pods/0c2e606c-94aa-4c97-aef4-741fc7402bac/volumes" Mar 18 10:03:19 crc kubenswrapper[4778]: I0318 10:03:19.037074 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-91af-account-create-update-cc4d5"] Mar 18 10:03:19 crc kubenswrapper[4778]: I0318 10:03:19.045828 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-j5mf6"] Mar 18 10:03:19 crc kubenswrapper[4778]: I0318 10:03:19.054596 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-91af-account-create-update-cc4d5"] Mar 18 10:03:19 crc kubenswrapper[4778]: I0318 10:03:19.063999 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-j5mf6"] Mar 18 10:03:20 crc kubenswrapper[4778]: I0318 10:03:20.196846 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57dd6190-5149-44a9-8a75-7e3d9077a43c" path="/var/lib/kubelet/pods/57dd6190-5149-44a9-8a75-7e3d9077a43c/volumes" Mar 18 10:03:20 crc kubenswrapper[4778]: I0318 10:03:20.198115 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" path="/var/lib/kubelet/pods/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0/volumes" Mar 18 
10:03:30 crc kubenswrapper[4778]: I0318 10:03:30.147698 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:03:30 crc kubenswrapper[4778]: I0318 10:03:30.148358 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:03:33 crc kubenswrapper[4778]: I0318 10:03:33.844834 4778 scope.go:117] "RemoveContainer" containerID="fc0fef996b4b9a5437de59c8b2ac8a5e7d95ba6ac33a74f54e7f79985c001e66" Mar 18 10:03:33 crc kubenswrapper[4778]: I0318 10:03:33.872496 4778 scope.go:117] "RemoveContainer" containerID="a03567ec27a1b447b64f31d017f135a530faf96727b9cdb25f25df3c2b11ab27" Mar 18 10:03:38 crc kubenswrapper[4778]: I0318 10:03:38.035319 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-fnlvs"] Mar 18 10:03:38 crc kubenswrapper[4778]: I0318 10:03:38.044361 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-fnlvs"] Mar 18 10:03:38 crc kubenswrapper[4778]: I0318 10:03:38.198253 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f9ef2c-6a05-438a-a701-92c9ef84d46d" path="/var/lib/kubelet/pods/86f9ef2c-6a05-438a-a701-92c9ef84d46d/volumes" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.147053 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563804-fx6ns"] Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.147477 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.147999 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:04:00 crc kubenswrapper[4778]: E0318 10:04:00.148299 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="registry-server" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.148318 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="registry-server" Mar 18 10:04:00 crc kubenswrapper[4778]: E0318 10:04:00.148332 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="extract-utilities" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.148341 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="extract-utilities" Mar 18 10:04:00 crc kubenswrapper[4778]: E0318 10:04:00.148357 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="extract-content" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.148367 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="extract-content" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.148620 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="registry-server" Mar 18 10:04:00 
crc kubenswrapper[4778]: I0318 10:04:00.149508 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.151388 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.152538 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.152683 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.162828 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563804-fx6ns"] Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.248924 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-799vx\" (UniqueName: \"kubernetes.io/projected/5479124a-5b9d-403a-baf8-0e03ec15c707-kube-api-access-799vx\") pod \"auto-csr-approver-29563804-fx6ns\" (UID: \"5479124a-5b9d-403a-baf8-0e03ec15c707\") " pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.352999 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-799vx\" (UniqueName: \"kubernetes.io/projected/5479124a-5b9d-403a-baf8-0e03ec15c707-kube-api-access-799vx\") pod \"auto-csr-approver-29563804-fx6ns\" (UID: \"5479124a-5b9d-403a-baf8-0e03ec15c707\") " pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.381042 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-799vx\" (UniqueName: \"kubernetes.io/projected/5479124a-5b9d-403a-baf8-0e03ec15c707-kube-api-access-799vx\") 
pod \"auto-csr-approver-29563804-fx6ns\" (UID: \"5479124a-5b9d-403a-baf8-0e03ec15c707\") " pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.473607 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.988119 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563804-fx6ns"] Mar 18 10:04:01 crc kubenswrapper[4778]: I0318 10:04:01.042766 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" event={"ID":"5479124a-5b9d-403a-baf8-0e03ec15c707","Type":"ContainerStarted","Data":"53a513fbd91a8f5fce5776158592c3922ba4bd0582fe66a2b01c5a8c1dbc7219"} Mar 18 10:04:03 crc kubenswrapper[4778]: I0318 10:04:03.068994 4778 generic.go:334] "Generic (PLEG): container finished" podID="5479124a-5b9d-403a-baf8-0e03ec15c707" containerID="9989303583a93d583be4ddef72e12372e2db44e67cf888c7373ff47c4f5bfee9" exitCode=0 Mar 18 10:04:03 crc kubenswrapper[4778]: I0318 10:04:03.069071 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" event={"ID":"5479124a-5b9d-403a-baf8-0e03ec15c707","Type":"ContainerDied","Data":"9989303583a93d583be4ddef72e12372e2db44e67cf888c7373ff47c4f5bfee9"} Mar 18 10:04:04 crc kubenswrapper[4778]: I0318 10:04:04.645104 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:04 crc kubenswrapper[4778]: I0318 10:04:04.745019 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-799vx\" (UniqueName: \"kubernetes.io/projected/5479124a-5b9d-403a-baf8-0e03ec15c707-kube-api-access-799vx\") pod \"5479124a-5b9d-403a-baf8-0e03ec15c707\" (UID: \"5479124a-5b9d-403a-baf8-0e03ec15c707\") " Mar 18 10:04:04 crc kubenswrapper[4778]: I0318 10:04:04.751594 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5479124a-5b9d-403a-baf8-0e03ec15c707-kube-api-access-799vx" (OuterVolumeSpecName: "kube-api-access-799vx") pod "5479124a-5b9d-403a-baf8-0e03ec15c707" (UID: "5479124a-5b9d-403a-baf8-0e03ec15c707"). InnerVolumeSpecName "kube-api-access-799vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:04:04 crc kubenswrapper[4778]: I0318 10:04:04.847896 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-799vx\" (UniqueName: \"kubernetes.io/projected/5479124a-5b9d-403a-baf8-0e03ec15c707-kube-api-access-799vx\") on node \"crc\" DevicePath \"\"" Mar 18 10:04:05 crc kubenswrapper[4778]: I0318 10:04:05.088670 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" event={"ID":"5479124a-5b9d-403a-baf8-0e03ec15c707","Type":"ContainerDied","Data":"53a513fbd91a8f5fce5776158592c3922ba4bd0582fe66a2b01c5a8c1dbc7219"} Mar 18 10:04:05 crc kubenswrapper[4778]: I0318 10:04:05.088981 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53a513fbd91a8f5fce5776158592c3922ba4bd0582fe66a2b01c5a8c1dbc7219" Mar 18 10:04:05 crc kubenswrapper[4778]: I0318 10:04:05.088705 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:05 crc kubenswrapper[4778]: I0318 10:04:05.712972 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563798-7x8l8"] Mar 18 10:04:05 crc kubenswrapper[4778]: I0318 10:04:05.728023 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563798-7x8l8"] Mar 18 10:04:06 crc kubenswrapper[4778]: I0318 10:04:06.197498 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18a30920-760a-4dd3-ac4a-63b9add62521" path="/var/lib/kubelet/pods/18a30920-760a-4dd3-ac4a-63b9add62521/volumes" Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.148064 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.148788 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.148855 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.150043 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.150150 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" gracePeriod=600 Mar 18 10:04:30 crc kubenswrapper[4778]: E0318 10:04:30.275019 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.306861 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" exitCode=0 Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.306924 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"} Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.306980 4778 scope.go:117] "RemoveContainer" containerID="4a5832ad33a24fa897a57dad207c0db67df6b8bbaaa82af43426329f350e2e07" Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.308279 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:04:30 crc kubenswrapper[4778]: E0318 10:04:30.308676 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:04:33 crc kubenswrapper[4778]: I0318 10:04:33.982563 4778 scope.go:117] "RemoveContainer" containerID="c09837191b6ba16c2c4c1ba6934469f4e373f1877eaa0a419ae86d94a526194d" Mar 18 10:04:34 crc kubenswrapper[4778]: I0318 10:04:34.017691 4778 scope.go:117] "RemoveContainer" containerID="d0bc455b5828f2dc8d018076f6b57f0e3f54f0daa7e1a53584affe8f4dab5285" Mar 18 10:04:41 crc kubenswrapper[4778]: I0318 10:04:41.186699 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:04:41 crc kubenswrapper[4778]: E0318 10:04:41.187677 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:04:52 crc kubenswrapper[4778]: I0318 10:04:52.187041 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:04:52 crc kubenswrapper[4778]: E0318 10:04:52.187898 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:05:04 crc kubenswrapper[4778]: I0318 10:05:04.194278 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:05:04 crc kubenswrapper[4778]: E0318 10:05:04.195573 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:05:17 crc kubenswrapper[4778]: I0318 10:05:17.188392 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:05:17 crc kubenswrapper[4778]: E0318 10:05:17.189137 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:05:32 crc kubenswrapper[4778]: I0318 10:05:32.187848 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:05:32 crc kubenswrapper[4778]: E0318 10:05:32.188950 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.042998 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w6ds2"] Mar 18 10:05:42 crc kubenswrapper[4778]: E0318 10:05:42.044567 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5479124a-5b9d-403a-baf8-0e03ec15c707" containerName="oc" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.044600 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5479124a-5b9d-403a-baf8-0e03ec15c707" containerName="oc" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.045067 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5479124a-5b9d-403a-baf8-0e03ec15c707" containerName="oc" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.047620 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.060148 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w6ds2"] Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.191911 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qnh6\" (UniqueName: \"kubernetes.io/projected/3985ebd1-17ce-47b8-b029-b521e40d6bb2-kube-api-access-8qnh6\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.191999 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-utilities\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.192067 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-catalog-content\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.293640 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qnh6\" (UniqueName: \"kubernetes.io/projected/3985ebd1-17ce-47b8-b029-b521e40d6bb2-kube-api-access-8qnh6\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.293792 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-utilities\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.293876 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-catalog-content\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.296503 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-utilities\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.296884 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-catalog-content\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.322092 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qnh6\" (UniqueName: \"kubernetes.io/projected/3985ebd1-17ce-47b8-b029-b521e40d6bb2-kube-api-access-8qnh6\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.391277 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.971146 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w6ds2"] Mar 18 10:05:43 crc kubenswrapper[4778]: I0318 10:05:43.187600 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:05:43 crc kubenswrapper[4778]: E0318 10:05:43.187928 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:05:43 crc kubenswrapper[4778]: I0318 10:05:43.936927 4778 generic.go:334] "Generic (PLEG): container finished" podID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerID="a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156" exitCode=0 Mar 18 10:05:43 crc kubenswrapper[4778]: I0318 10:05:43.937281 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerDied","Data":"a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156"} Mar 18 10:05:43 crc kubenswrapper[4778]: I0318 10:05:43.938241 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerStarted","Data":"ff1791fd6dc7166759becaa22e236a584f49d5a34e46f1ae3cd70fe242aa8182"} Mar 18 10:05:43 crc kubenswrapper[4778]: I0318 10:05:43.939816 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:05:44 crc 
kubenswrapper[4778]: I0318 10:05:44.225767 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k9mc4"] Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.228549 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.254011 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9mc4"] Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.335679 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-utilities\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.336649 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-catalog-content\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.336756 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgqzh\" (UniqueName: \"kubernetes.io/projected/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-kube-api-access-fgqzh\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.438047 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-catalog-content\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.438106 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgqzh\" (UniqueName: \"kubernetes.io/projected/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-kube-api-access-fgqzh\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.438254 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-utilities\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.438699 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-utilities\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.438919 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-catalog-content\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.460642 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgqzh\" (UniqueName: 
\"kubernetes.io/projected/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-kube-api-access-fgqzh\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.553542 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:45 crc kubenswrapper[4778]: I0318 10:05:45.154055 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9mc4"] Mar 18 10:05:45 crc kubenswrapper[4778]: I0318 10:05:45.972384 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerStarted","Data":"025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942"} Mar 18 10:05:45 crc kubenswrapper[4778]: I0318 10:05:45.974725 4778 generic.go:334] "Generic (PLEG): container finished" podID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerID="71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db" exitCode=0 Mar 18 10:05:45 crc kubenswrapper[4778]: I0318 10:05:45.974770 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerDied","Data":"71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db"} Mar 18 10:05:45 crc kubenswrapper[4778]: I0318 10:05:45.974801 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerStarted","Data":"76f017bfaa7bd549bf50c3e81af429cb8139358679676c8625b5bb5e1d24e86b"} Mar 18 10:05:47 crc kubenswrapper[4778]: I0318 10:05:47.997618 4778 generic.go:334] "Generic (PLEG): container finished" podID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" 
containerID="025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942" exitCode=0 Mar 18 10:05:47 crc kubenswrapper[4778]: I0318 10:05:47.998010 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerDied","Data":"025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942"} Mar 18 10:05:48 crc kubenswrapper[4778]: I0318 10:05:48.007156 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerStarted","Data":"4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1"} Mar 18 10:05:49 crc kubenswrapper[4778]: I0318 10:05:49.018727 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerStarted","Data":"aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792"} Mar 18 10:05:49 crc kubenswrapper[4778]: I0318 10:05:49.021564 4778 generic.go:334] "Generic (PLEG): container finished" podID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerID="4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1" exitCode=0 Mar 18 10:05:49 crc kubenswrapper[4778]: I0318 10:05:49.021653 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerDied","Data":"4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1"} Mar 18 10:05:49 crc kubenswrapper[4778]: I0318 10:05:49.050283 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w6ds2" podStartSLOduration=2.376242895 podStartE2EDuration="7.050265869s" podCreationTimestamp="2026-03-18 10:05:42 +0000 UTC" firstStartedPulling="2026-03-18 10:05:43.939532672 +0000 UTC 
m=+3810.514277512" lastFinishedPulling="2026-03-18 10:05:48.613555646 +0000 UTC m=+3815.188300486" observedRunningTime="2026-03-18 10:05:49.049646672 +0000 UTC m=+3815.624391532" watchObservedRunningTime="2026-03-18 10:05:49.050265869 +0000 UTC m=+3815.625010709" Mar 18 10:05:50 crc kubenswrapper[4778]: I0318 10:05:50.032579 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerStarted","Data":"aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89"} Mar 18 10:05:50 crc kubenswrapper[4778]: I0318 10:05:50.053472 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k9mc4" podStartSLOduration=2.595072555 podStartE2EDuration="6.053451446s" podCreationTimestamp="2026-03-18 10:05:44 +0000 UTC" firstStartedPulling="2026-03-18 10:05:45.976980745 +0000 UTC m=+3812.551725585" lastFinishedPulling="2026-03-18 10:05:49.435359636 +0000 UTC m=+3816.010104476" observedRunningTime="2026-03-18 10:05:50.050054054 +0000 UTC m=+3816.624798904" watchObservedRunningTime="2026-03-18 10:05:50.053451446 +0000 UTC m=+3816.628196286" Mar 18 10:05:52 crc kubenswrapper[4778]: I0318 10:05:52.392102 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:52 crc kubenswrapper[4778]: I0318 10:05:52.392481 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:53 crc kubenswrapper[4778]: I0318 10:05:53.458305 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w6ds2" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="registry-server" probeResult="failure" output=< Mar 18 10:05:53 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:05:53 crc 
kubenswrapper[4778]: > Mar 18 10:05:54 crc kubenswrapper[4778]: I0318 10:05:54.554836 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:54 crc kubenswrapper[4778]: I0318 10:05:54.555160 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:54 crc kubenswrapper[4778]: I0318 10:05:54.608625 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:55 crc kubenswrapper[4778]: I0318 10:05:55.116176 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:55 crc kubenswrapper[4778]: I0318 10:05:55.160666 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k9mc4"] Mar 18 10:05:55 crc kubenswrapper[4778]: I0318 10:05:55.187172 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:05:55 crc kubenswrapper[4778]: E0318 10:05:55.187656 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.091411 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k9mc4" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="registry-server" containerID="cri-o://aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89" gracePeriod=2 
Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.795566 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.920465 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-catalog-content\") pod \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.920602 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-utilities\") pod \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.920648 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgqzh\" (UniqueName: \"kubernetes.io/projected/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-kube-api-access-fgqzh\") pod \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.921444 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-utilities" (OuterVolumeSpecName: "utilities") pod "3ceaceba-a6ca-4b0f-8964-8079b9dbb102" (UID: "3ceaceba-a6ca-4b0f-8964-8079b9dbb102"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.933297 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-kube-api-access-fgqzh" (OuterVolumeSpecName: "kube-api-access-fgqzh") pod "3ceaceba-a6ca-4b0f-8964-8079b9dbb102" (UID: "3ceaceba-a6ca-4b0f-8964-8079b9dbb102"). InnerVolumeSpecName "kube-api-access-fgqzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.973936 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ceaceba-a6ca-4b0f-8964-8079b9dbb102" (UID: "3ceaceba-a6ca-4b0f-8964-8079b9dbb102"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.022781 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.022818 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgqzh\" (UniqueName: \"kubernetes.io/projected/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-kube-api-access-fgqzh\") on node \"crc\" DevicePath \"\"" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.022830 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.103049 4778 generic.go:334] "Generic (PLEG): container finished" podID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" 
containerID="aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89" exitCode=0 Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.103095 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerDied","Data":"aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89"} Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.103129 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerDied","Data":"76f017bfaa7bd549bf50c3e81af429cb8139358679676c8625b5bb5e1d24e86b"} Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.103129 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.103165 4778 scope.go:117] "RemoveContainer" containerID="aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.123703 4778 scope.go:117] "RemoveContainer" containerID="4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.160530 4778 scope.go:117] "RemoveContainer" containerID="71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.167628 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k9mc4"] Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.176271 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k9mc4"] Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.206686 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" 
path="/var/lib/kubelet/pods/3ceaceba-a6ca-4b0f-8964-8079b9dbb102/volumes" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.208128 4778 scope.go:117] "RemoveContainer" containerID="aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89" Mar 18 10:05:58 crc kubenswrapper[4778]: E0318 10:05:58.208770 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89\": container with ID starting with aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89 not found: ID does not exist" containerID="aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.208804 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89"} err="failed to get container status \"aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89\": rpc error: code = NotFound desc = could not find container \"aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89\": container with ID starting with aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89 not found: ID does not exist" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.208829 4778 scope.go:117] "RemoveContainer" containerID="4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1" Mar 18 10:05:58 crc kubenswrapper[4778]: E0318 10:05:58.209405 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1\": container with ID starting with 4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1 not found: ID does not exist" containerID="4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1" Mar 18 10:05:58 crc kubenswrapper[4778]: 
I0318 10:05:58.209486 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1"} err="failed to get container status \"4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1\": rpc error: code = NotFound desc = could not find container \"4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1\": container with ID starting with 4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1 not found: ID does not exist" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.209543 4778 scope.go:117] "RemoveContainer" containerID="71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db" Mar 18 10:05:58 crc kubenswrapper[4778]: E0318 10:05:58.209989 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db\": container with ID starting with 71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db not found: ID does not exist" containerID="71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.210028 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db"} err="failed to get container status \"71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db\": rpc error: code = NotFound desc = could not find container \"71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db\": container with ID starting with 71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db not found: ID does not exist" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.152396 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563806-n2m7x"] Mar 18 10:06:00 crc 
kubenswrapper[4778]: E0318 10:06:00.152805 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="extract-utilities" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.152820 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="extract-utilities" Mar 18 10:06:00 crc kubenswrapper[4778]: E0318 10:06:00.152842 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="registry-server" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.152850 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="registry-server" Mar 18 10:06:00 crc kubenswrapper[4778]: E0318 10:06:00.152874 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="extract-content" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.152882 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="extract-content" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.153087 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="registry-server" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.153819 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.157155 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.157229 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.157350 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.172274 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563806-n2m7x"] Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.268638 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwlh2\" (UniqueName: \"kubernetes.io/projected/dbb455c0-90cf-46a9-82c4-1c22d05e007d-kube-api-access-lwlh2\") pod \"auto-csr-approver-29563806-n2m7x\" (UID: \"dbb455c0-90cf-46a9-82c4-1c22d05e007d\") " pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.371753 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwlh2\" (UniqueName: \"kubernetes.io/projected/dbb455c0-90cf-46a9-82c4-1c22d05e007d-kube-api-access-lwlh2\") pod \"auto-csr-approver-29563806-n2m7x\" (UID: \"dbb455c0-90cf-46a9-82c4-1c22d05e007d\") " pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.395768 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwlh2\" (UniqueName: \"kubernetes.io/projected/dbb455c0-90cf-46a9-82c4-1c22d05e007d-kube-api-access-lwlh2\") pod \"auto-csr-approver-29563806-n2m7x\" (UID: \"dbb455c0-90cf-46a9-82c4-1c22d05e007d\") " 
pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.475934 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:00 crc kubenswrapper[4778]: W0318 10:06:00.937958 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbb455c0_90cf_46a9_82c4_1c22d05e007d.slice/crio-9c8d8d17a89e5e3a0a16943424709a6a2b796859194054adb0a830bd877ec159 WatchSource:0}: Error finding container 9c8d8d17a89e5e3a0a16943424709a6a2b796859194054adb0a830bd877ec159: Status 404 returned error can't find the container with id 9c8d8d17a89e5e3a0a16943424709a6a2b796859194054adb0a830bd877ec159 Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.945765 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563806-n2m7x"] Mar 18 10:06:01 crc kubenswrapper[4778]: I0318 10:06:01.143174 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" event={"ID":"dbb455c0-90cf-46a9-82c4-1c22d05e007d","Type":"ContainerStarted","Data":"9c8d8d17a89e5e3a0a16943424709a6a2b796859194054adb0a830bd877ec159"} Mar 18 10:06:03 crc kubenswrapper[4778]: I0318 10:06:03.165746 4778 generic.go:334] "Generic (PLEG): container finished" podID="dbb455c0-90cf-46a9-82c4-1c22d05e007d" containerID="6b16a1172e80d155110d49b222a6dc20e7b21d0a6a9927e8e5966e673f37ac10" exitCode=0 Mar 18 10:06:03 crc kubenswrapper[4778]: I0318 10:06:03.165816 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" event={"ID":"dbb455c0-90cf-46a9-82c4-1c22d05e007d","Type":"ContainerDied","Data":"6b16a1172e80d155110d49b222a6dc20e7b21d0a6a9927e8e5966e673f37ac10"} Mar 18 10:06:03 crc kubenswrapper[4778]: I0318 10:06:03.446810 4778 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-w6ds2" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="registry-server" probeResult="failure" output=< Mar 18 10:06:03 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:06:03 crc kubenswrapper[4778]: > Mar 18 10:06:04 crc kubenswrapper[4778]: I0318 10:06:04.803643 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:04 crc kubenswrapper[4778]: I0318 10:06:04.865893 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwlh2\" (UniqueName: \"kubernetes.io/projected/dbb455c0-90cf-46a9-82c4-1c22d05e007d-kube-api-access-lwlh2\") pod \"dbb455c0-90cf-46a9-82c4-1c22d05e007d\" (UID: \"dbb455c0-90cf-46a9-82c4-1c22d05e007d\") " Mar 18 10:06:04 crc kubenswrapper[4778]: I0318 10:06:04.872680 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb455c0-90cf-46a9-82c4-1c22d05e007d-kube-api-access-lwlh2" (OuterVolumeSpecName: "kube-api-access-lwlh2") pod "dbb455c0-90cf-46a9-82c4-1c22d05e007d" (UID: "dbb455c0-90cf-46a9-82c4-1c22d05e007d"). InnerVolumeSpecName "kube-api-access-lwlh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:06:04 crc kubenswrapper[4778]: I0318 10:06:04.968527 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwlh2\" (UniqueName: \"kubernetes.io/projected/dbb455c0-90cf-46a9-82c4-1c22d05e007d-kube-api-access-lwlh2\") on node \"crc\" DevicePath \"\"" Mar 18 10:06:05 crc kubenswrapper[4778]: I0318 10:06:05.183431 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" event={"ID":"dbb455c0-90cf-46a9-82c4-1c22d05e007d","Type":"ContainerDied","Data":"9c8d8d17a89e5e3a0a16943424709a6a2b796859194054adb0a830bd877ec159"} Mar 18 10:06:05 crc kubenswrapper[4778]: I0318 10:06:05.183470 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c8d8d17a89e5e3a0a16943424709a6a2b796859194054adb0a830bd877ec159" Mar 18 10:06:05 crc kubenswrapper[4778]: I0318 10:06:05.183474 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:05 crc kubenswrapper[4778]: I0318 10:06:05.882277 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563800-8grkw"] Mar 18 10:06:05 crc kubenswrapper[4778]: I0318 10:06:05.890822 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563800-8grkw"] Mar 18 10:06:06 crc kubenswrapper[4778]: I0318 10:06:06.188796 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:06:06 crc kubenswrapper[4778]: E0318 10:06:06.189165 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:06:06 crc kubenswrapper[4778]: I0318 10:06:06.199589 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7196caa-da0c-4933-b2d0-81c472bed9a9" path="/var/lib/kubelet/pods/b7196caa-da0c-4933-b2d0-81c472bed9a9/volumes" Mar 18 10:06:12 crc kubenswrapper[4778]: I0318 10:06:12.441672 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:06:12 crc kubenswrapper[4778]: I0318 10:06:12.490008 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:06:13 crc kubenswrapper[4778]: I0318 10:06:13.232017 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w6ds2"] Mar 18 10:06:14 crc kubenswrapper[4778]: I0318 10:06:14.279283 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w6ds2" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="registry-server" containerID="cri-o://aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792" gracePeriod=2 Mar 18 10:06:14 crc kubenswrapper[4778]: I0318 10:06:14.921578 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.070623 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qnh6\" (UniqueName: \"kubernetes.io/projected/3985ebd1-17ce-47b8-b029-b521e40d6bb2-kube-api-access-8qnh6\") pod \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.070795 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-utilities\") pod \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.070844 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-catalog-content\") pod \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.072065 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-utilities" (OuterVolumeSpecName: "utilities") pod "3985ebd1-17ce-47b8-b029-b521e40d6bb2" (UID: "3985ebd1-17ce-47b8-b029-b521e40d6bb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.082460 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3985ebd1-17ce-47b8-b029-b521e40d6bb2-kube-api-access-8qnh6" (OuterVolumeSpecName: "kube-api-access-8qnh6") pod "3985ebd1-17ce-47b8-b029-b521e40d6bb2" (UID: "3985ebd1-17ce-47b8-b029-b521e40d6bb2"). InnerVolumeSpecName "kube-api-access-8qnh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.173116 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.173159 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qnh6\" (UniqueName: \"kubernetes.io/projected/3985ebd1-17ce-47b8-b029-b521e40d6bb2-kube-api-access-8qnh6\") on node \"crc\" DevicePath \"\"" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.228922 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3985ebd1-17ce-47b8-b029-b521e40d6bb2" (UID: "3985ebd1-17ce-47b8-b029-b521e40d6bb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.275739 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.291425 4778 generic.go:334] "Generic (PLEG): container finished" podID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerID="aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792" exitCode=0 Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.291476 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerDied","Data":"aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792"} Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.291512 4778 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerDied","Data":"ff1791fd6dc7166759becaa22e236a584f49d5a34e46f1ae3cd70fe242aa8182"} Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.291534 4778 scope.go:117] "RemoveContainer" containerID="aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.292462 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.320654 4778 scope.go:117] "RemoveContainer" containerID="025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.335412 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w6ds2"] Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.343030 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w6ds2"] Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.345990 4778 scope.go:117] "RemoveContainer" containerID="a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.412940 4778 scope.go:117] "RemoveContainer" containerID="aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792" Mar 18 10:06:15 crc kubenswrapper[4778]: E0318 10:06:15.415475 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792\": container with ID starting with aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792 not found: ID does not exist" containerID="aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.415519 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792"} err="failed to get container status \"aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792\": rpc error: code = NotFound desc = could not find container \"aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792\": container with ID starting with aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792 not found: ID does not exist" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.415545 4778 scope.go:117] "RemoveContainer" containerID="025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942" Mar 18 10:06:15 crc kubenswrapper[4778]: E0318 10:06:15.416024 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942\": container with ID starting with 025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942 not found: ID does not exist" containerID="025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.416073 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942"} err="failed to get container status \"025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942\": rpc error: code = NotFound desc = could not find container \"025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942\": container with ID starting with 025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942 not found: ID does not exist" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.416100 4778 scope.go:117] "RemoveContainer" containerID="a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156" Mar 18 10:06:15 crc kubenswrapper[4778]: E0318 
10:06:15.416446 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156\": container with ID starting with a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156 not found: ID does not exist" containerID="a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.416472 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156"} err="failed to get container status \"a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156\": rpc error: code = NotFound desc = could not find container \"a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156\": container with ID starting with a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156 not found: ID does not exist" Mar 18 10:06:16 crc kubenswrapper[4778]: I0318 10:06:16.201794 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" path="/var/lib/kubelet/pods/3985ebd1-17ce-47b8-b029-b521e40d6bb2/volumes" Mar 18 10:06:18 crc kubenswrapper[4778]: I0318 10:06:18.187773 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:06:18 crc kubenswrapper[4778]: E0318 10:06:18.188249 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:06:31 crc kubenswrapper[4778]: I0318 10:06:31.188935 
4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:06:31 crc kubenswrapper[4778]: E0318 10:06:31.189711 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:06:34 crc kubenswrapper[4778]: I0318 10:06:34.157880 4778 scope.go:117] "RemoveContainer" containerID="52b2bf061001e2a8dfe4b355dbc94585c1d208b337020ae42eb7ee2f487a7b0c" Mar 18 10:06:44 crc kubenswrapper[4778]: I0318 10:06:44.193801 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:06:44 crc kubenswrapper[4778]: E0318 10:06:44.194712 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:06:57 crc kubenswrapper[4778]: I0318 10:06:57.187967 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:06:57 crc kubenswrapper[4778]: E0318 10:06:57.188688 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:07:08 crc kubenswrapper[4778]: I0318 10:07:08.187175 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:07:08 crc kubenswrapper[4778]: E0318 10:07:08.187812 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:07:21 crc kubenswrapper[4778]: I0318 10:07:21.187470 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:07:21 crc kubenswrapper[4778]: E0318 10:07:21.189008 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:07:35 crc kubenswrapper[4778]: I0318 10:07:35.187178 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:07:35 crc kubenswrapper[4778]: E0318 10:07:35.188221 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:07:46 crc kubenswrapper[4778]: I0318 10:07:46.186945 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:07:46 crc kubenswrapper[4778]: E0318 10:07:46.187799 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.154771 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563808-8zn6m"] Mar 18 10:08:00 crc kubenswrapper[4778]: E0318 10:08:00.159767 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="extract-utilities" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.160083 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="extract-utilities" Mar 18 10:08:00 crc kubenswrapper[4778]: E0318 10:08:00.160107 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="registry-server" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.160117 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="registry-server" Mar 18 10:08:00 crc kubenswrapper[4778]: E0318 10:08:00.160138 4778 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="extract-content" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.160148 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="extract-content" Mar 18 10:08:00 crc kubenswrapper[4778]: E0318 10:08:00.160164 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb455c0-90cf-46a9-82c4-1c22d05e007d" containerName="oc" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.160171 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb455c0-90cf-46a9-82c4-1c22d05e007d" containerName="oc" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.160413 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="registry-server" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.160434 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbb455c0-90cf-46a9-82c4-1c22d05e007d" containerName="oc" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.161272 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563808-8zn6m" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.163167 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.163652 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.164222 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.176979 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563808-8zn6m"] Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.187750 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:08:00 crc kubenswrapper[4778]: E0318 10:08:00.187977 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.282772 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms4bw\" (UniqueName: \"kubernetes.io/projected/0a5c01e2-3264-47f4-8081-48235752ef32-kube-api-access-ms4bw\") pod \"auto-csr-approver-29563808-8zn6m\" (UID: \"0a5c01e2-3264-47f4-8081-48235752ef32\") " pod="openshift-infra/auto-csr-approver-29563808-8zn6m" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.384054 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ms4bw\" (UniqueName: \"kubernetes.io/projected/0a5c01e2-3264-47f4-8081-48235752ef32-kube-api-access-ms4bw\") pod \"auto-csr-approver-29563808-8zn6m\" (UID: \"0a5c01e2-3264-47f4-8081-48235752ef32\") " pod="openshift-infra/auto-csr-approver-29563808-8zn6m"
Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.410986 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms4bw\" (UniqueName: \"kubernetes.io/projected/0a5c01e2-3264-47f4-8081-48235752ef32-kube-api-access-ms4bw\") pod \"auto-csr-approver-29563808-8zn6m\" (UID: \"0a5c01e2-3264-47f4-8081-48235752ef32\") " pod="openshift-infra/auto-csr-approver-29563808-8zn6m"
Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.482804 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563808-8zn6m"
Mar 18 10:08:01 crc kubenswrapper[4778]: I0318 10:08:01.082515 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563808-8zn6m"]
Mar 18 10:08:01 crc kubenswrapper[4778]: I0318 10:08:01.202559 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563808-8zn6m" event={"ID":"0a5c01e2-3264-47f4-8081-48235752ef32","Type":"ContainerStarted","Data":"cd72fbd5ff8b1d95f025048cf58b89d1f994e5eafbc56a9f42db0e9ae2c10dd2"}
Mar 18 10:08:03 crc kubenswrapper[4778]: I0318 10:08:03.220509 4778 generic.go:334] "Generic (PLEG): container finished" podID="0a5c01e2-3264-47f4-8081-48235752ef32" containerID="763e05415a17628575fc7b5a79a4b0a9348cfd1deec11024a98e0e749d405535" exitCode=0
Mar 18 10:08:03 crc kubenswrapper[4778]: I0318 10:08:03.220598 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563808-8zn6m" event={"ID":"0a5c01e2-3264-47f4-8081-48235752ef32","Type":"ContainerDied","Data":"763e05415a17628575fc7b5a79a4b0a9348cfd1deec11024a98e0e749d405535"}
Mar 18 10:08:04 crc kubenswrapper[4778]: I0318 10:08:04.863649 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563808-8zn6m"
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:04.999808 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms4bw\" (UniqueName: \"kubernetes.io/projected/0a5c01e2-3264-47f4-8081-48235752ef32-kube-api-access-ms4bw\") pod \"0a5c01e2-3264-47f4-8081-48235752ef32\" (UID: \"0a5c01e2-3264-47f4-8081-48235752ef32\") "
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.019384 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a5c01e2-3264-47f4-8081-48235752ef32-kube-api-access-ms4bw" (OuterVolumeSpecName: "kube-api-access-ms4bw") pod "0a5c01e2-3264-47f4-8081-48235752ef32" (UID: "0a5c01e2-3264-47f4-8081-48235752ef32"). InnerVolumeSpecName "kube-api-access-ms4bw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.102307 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms4bw\" (UniqueName: \"kubernetes.io/projected/0a5c01e2-3264-47f4-8081-48235752ef32-kube-api-access-ms4bw\") on node \"crc\" DevicePath \"\""
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.240923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563808-8zn6m" event={"ID":"0a5c01e2-3264-47f4-8081-48235752ef32","Type":"ContainerDied","Data":"cd72fbd5ff8b1d95f025048cf58b89d1f994e5eafbc56a9f42db0e9ae2c10dd2"}
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.240966 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd72fbd5ff8b1d95f025048cf58b89d1f994e5eafbc56a9f42db0e9ae2c10dd2"
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.241005 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563808-8zn6m"
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.939294 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563802-2blvs"]
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.950569 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563802-2blvs"]
Mar 18 10:08:06 crc kubenswrapper[4778]: I0318 10:08:06.198024 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a9f934-8e78-4c0c-b0cc-59cd49030b5c" path="/var/lib/kubelet/pods/02a9f934-8e78-4c0c-b0cc-59cd49030b5c/volumes"
Mar 18 10:08:13 crc kubenswrapper[4778]: I0318 10:08:13.188331 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:08:13 crc kubenswrapper[4778]: E0318 10:08:13.189055 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:08:26 crc kubenswrapper[4778]: I0318 10:08:26.187370 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:08:26 crc kubenswrapper[4778]: E0318 10:08:26.188551 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:08:34 crc kubenswrapper[4778]: I0318 10:08:34.286453 4778 scope.go:117] "RemoveContainer" containerID="048eac64aca1190d343bcd6e5968c051bfb3de1baa3c94a83313a7e1b9b996de"
Mar 18 10:08:38 crc kubenswrapper[4778]: I0318 10:08:38.187574 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:08:38 crc kubenswrapper[4778]: E0318 10:08:38.189087 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:08:53 crc kubenswrapper[4778]: I0318 10:08:53.187854 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:08:53 crc kubenswrapper[4778]: E0318 10:08:53.188721 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:09:07 crc kubenswrapper[4778]: I0318 10:09:07.188030 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:09:07 crc kubenswrapper[4778]: E0318 10:09:07.188858 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:09:20 crc kubenswrapper[4778]: I0318 10:09:20.188900 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:09:20 crc kubenswrapper[4778]: E0318 10:09:20.190129 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:09:34 crc kubenswrapper[4778]: I0318 10:09:34.194653 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:09:34 crc kubenswrapper[4778]: I0318 10:09:34.727667 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"d5e0db80372568acf47512b0578afc67f28216d99c8be88c569e735224dbd2f3"}
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.162162 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563810-gbsth"]
Mar 18 10:10:00 crc kubenswrapper[4778]: E0318 10:10:00.163264 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a5c01e2-3264-47f4-8081-48235752ef32" containerName="oc"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.163281 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a5c01e2-3264-47f4-8081-48235752ef32" containerName="oc"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.163514 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a5c01e2-3264-47f4-8081-48235752ef32" containerName="oc"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.164377 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.167539 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.169579 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.169942 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.178934 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563810-gbsth"]
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.267567 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjxh5\" (UniqueName: \"kubernetes.io/projected/2da82992-5b46-450f-9fe2-fb1aab2e40a5-kube-api-access-wjxh5\") pod \"auto-csr-approver-29563810-gbsth\" (UID: \"2da82992-5b46-450f-9fe2-fb1aab2e40a5\") " pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.369592 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjxh5\" (UniqueName: \"kubernetes.io/projected/2da82992-5b46-450f-9fe2-fb1aab2e40a5-kube-api-access-wjxh5\") pod \"auto-csr-approver-29563810-gbsth\" (UID: \"2da82992-5b46-450f-9fe2-fb1aab2e40a5\") " pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.389704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjxh5\" (UniqueName: \"kubernetes.io/projected/2da82992-5b46-450f-9fe2-fb1aab2e40a5-kube-api-access-wjxh5\") pod \"auto-csr-approver-29563810-gbsth\" (UID: \"2da82992-5b46-450f-9fe2-fb1aab2e40a5\") " pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.485210 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:01 crc kubenswrapper[4778]: I0318 10:10:01.101428 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563810-gbsth"]
Mar 18 10:10:01 crc kubenswrapper[4778]: I0318 10:10:01.973394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563810-gbsth" event={"ID":"2da82992-5b46-450f-9fe2-fb1aab2e40a5","Type":"ContainerStarted","Data":"1ac94c05b44a514cc09d0c7992208feacdd1efd309af86fc6adb31463d969cf4"}
Mar 18 10:10:02 crc kubenswrapper[4778]: I0318 10:10:02.982755 4778 generic.go:334] "Generic (PLEG): container finished" podID="2da82992-5b46-450f-9fe2-fb1aab2e40a5" containerID="92b3bdd08fed961c28977d075899ad8197ec334db09149cee7b1c8a99c5b48b6" exitCode=0
Mar 18 10:10:02 crc kubenswrapper[4778]: I0318 10:10:02.982805 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563810-gbsth" event={"ID":"2da82992-5b46-450f-9fe2-fb1aab2e40a5","Type":"ContainerDied","Data":"92b3bdd08fed961c28977d075899ad8197ec334db09149cee7b1c8a99c5b48b6"}
Mar 18 10:10:04 crc kubenswrapper[4778]: I0318 10:10:04.528251 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:04 crc kubenswrapper[4778]: I0318 10:10:04.660635 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjxh5\" (UniqueName: \"kubernetes.io/projected/2da82992-5b46-450f-9fe2-fb1aab2e40a5-kube-api-access-wjxh5\") pod \"2da82992-5b46-450f-9fe2-fb1aab2e40a5\" (UID: \"2da82992-5b46-450f-9fe2-fb1aab2e40a5\") "
Mar 18 10:10:04 crc kubenswrapper[4778]: I0318 10:10:04.671690 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da82992-5b46-450f-9fe2-fb1aab2e40a5-kube-api-access-wjxh5" (OuterVolumeSpecName: "kube-api-access-wjxh5") pod "2da82992-5b46-450f-9fe2-fb1aab2e40a5" (UID: "2da82992-5b46-450f-9fe2-fb1aab2e40a5"). InnerVolumeSpecName "kube-api-access-wjxh5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:10:04 crc kubenswrapper[4778]: I0318 10:10:04.763387 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjxh5\" (UniqueName: \"kubernetes.io/projected/2da82992-5b46-450f-9fe2-fb1aab2e40a5-kube-api-access-wjxh5\") on node \"crc\" DevicePath \"\""
Mar 18 10:10:05 crc kubenswrapper[4778]: I0318 10:10:05.001689 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563810-gbsth" event={"ID":"2da82992-5b46-450f-9fe2-fb1aab2e40a5","Type":"ContainerDied","Data":"1ac94c05b44a514cc09d0c7992208feacdd1efd309af86fc6adb31463d969cf4"}
Mar 18 10:10:05 crc kubenswrapper[4778]: I0318 10:10:05.001746 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ac94c05b44a514cc09d0c7992208feacdd1efd309af86fc6adb31463d969cf4"
Mar 18 10:10:05 crc kubenswrapper[4778]: I0318 10:10:05.001812 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:05 crc kubenswrapper[4778]: I0318 10:10:05.609171 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563804-fx6ns"]
Mar 18 10:10:05 crc kubenswrapper[4778]: I0318 10:10:05.618928 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563804-fx6ns"]
Mar 18 10:10:06 crc kubenswrapper[4778]: I0318 10:10:06.199973 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5479124a-5b9d-403a-baf8-0e03ec15c707" path="/var/lib/kubelet/pods/5479124a-5b9d-403a-baf8-0e03ec15c707/volumes"
Mar 18 10:10:34 crc kubenswrapper[4778]: I0318 10:10:34.384868 4778 scope.go:117] "RemoveContainer" containerID="9989303583a93d583be4ddef72e12372e2db44e67cf888c7373ff47c4f5bfee9"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.670471 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9lp4z"]
Mar 18 10:11:37 crc kubenswrapper[4778]: E0318 10:11:37.671567 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da82992-5b46-450f-9fe2-fb1aab2e40a5" containerName="oc"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.671586 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da82992-5b46-450f-9fe2-fb1aab2e40a5" containerName="oc"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.671847 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da82992-5b46-450f-9fe2-fb1aab2e40a5" containerName="oc"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.673569 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.683495 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9lp4z"]
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.851103 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hmb4\" (UniqueName: \"kubernetes.io/projected/e86477b8-6733-463c-9c1e-3fdc3af149c8-kube-api-access-2hmb4\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.851245 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-utilities\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.851425 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-catalog-content\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.953296 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hmb4\" (UniqueName: \"kubernetes.io/projected/e86477b8-6733-463c-9c1e-3fdc3af149c8-kube-api-access-2hmb4\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.953375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-utilities\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.953450 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-catalog-content\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.953903 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-catalog-content\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.954028 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-utilities\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.973507 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hmb4\" (UniqueName: \"kubernetes.io/projected/e86477b8-6733-463c-9c1e-3fdc3af149c8-kube-api-access-2hmb4\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:38 crc kubenswrapper[4778]: I0318 10:11:38.005742 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:38 crc kubenswrapper[4778]: I0318 10:11:38.535775 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9lp4z"]
Mar 18 10:11:38 crc kubenswrapper[4778]: I0318 10:11:38.816040 4778 generic.go:334] "Generic (PLEG): container finished" podID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerID="12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e" exitCode=0
Mar 18 10:11:38 crc kubenswrapper[4778]: I0318 10:11:38.816146 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerDied","Data":"12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e"}
Mar 18 10:11:38 crc kubenswrapper[4778]: I0318 10:11:38.816418 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerStarted","Data":"4b279340952032e1de97141653a8272e40b4b4504477e11e39bed863cada10e3"}
Mar 18 10:11:38 crc kubenswrapper[4778]: I0318 10:11:38.817737 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 10:11:39 crc kubenswrapper[4778]: I0318 10:11:39.825458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerStarted","Data":"cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2"}
Mar 18 10:11:41 crc kubenswrapper[4778]: I0318 10:11:41.842707 4778 generic.go:334] "Generic (PLEG): container finished" podID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerID="cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2" exitCode=0
Mar 18 10:11:41 crc kubenswrapper[4778]: I0318 10:11:41.842929 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerDied","Data":"cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2"}
Mar 18 10:11:42 crc kubenswrapper[4778]: I0318 10:11:42.856774 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerStarted","Data":"d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810"}
Mar 18 10:11:42 crc kubenswrapper[4778]: I0318 10:11:42.888871 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9lp4z" podStartSLOduration=2.282660185 podStartE2EDuration="5.888852493s" podCreationTimestamp="2026-03-18 10:11:37 +0000 UTC" firstStartedPulling="2026-03-18 10:11:38.817505542 +0000 UTC m=+4165.392250382" lastFinishedPulling="2026-03-18 10:11:42.42369785 +0000 UTC m=+4168.998442690" observedRunningTime="2026-03-18 10:11:42.88690285 +0000 UTC m=+4169.461647690" watchObservedRunningTime="2026-03-18 10:11:42.888852493 +0000 UTC m=+4169.463597333"
Mar 18 10:11:48 crc kubenswrapper[4778]: I0318 10:11:48.006224 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:48 crc kubenswrapper[4778]: I0318 10:11:48.006581 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:48 crc kubenswrapper[4778]: I0318 10:11:48.085329 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:48 crc kubenswrapper[4778]: I0318 10:11:48.963405 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:50 crc kubenswrapper[4778]: I0318 10:11:50.418672 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9lp4z"]
Mar 18 10:11:50 crc kubenswrapper[4778]: I0318 10:11:50.933570 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9lp4z" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="registry-server" containerID="cri-o://d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810" gracePeriod=2
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.729165 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.881620 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-catalog-content\") pod \"e86477b8-6733-463c-9c1e-3fdc3af149c8\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") "
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.881799 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-utilities\") pod \"e86477b8-6733-463c-9c1e-3fdc3af149c8\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") "
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.881938 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hmb4\" (UniqueName: \"kubernetes.io/projected/e86477b8-6733-463c-9c1e-3fdc3af149c8-kube-api-access-2hmb4\") pod \"e86477b8-6733-463c-9c1e-3fdc3af149c8\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") "
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.882566 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-utilities" (OuterVolumeSpecName: "utilities") pod "e86477b8-6733-463c-9c1e-3fdc3af149c8" (UID: "e86477b8-6733-463c-9c1e-3fdc3af149c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.889259 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86477b8-6733-463c-9c1e-3fdc3af149c8-kube-api-access-2hmb4" (OuterVolumeSpecName: "kube-api-access-2hmb4") pod "e86477b8-6733-463c-9c1e-3fdc3af149c8" (UID: "e86477b8-6733-463c-9c1e-3fdc3af149c8"). InnerVolumeSpecName "kube-api-access-2hmb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.936309 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e86477b8-6733-463c-9c1e-3fdc3af149c8" (UID: "e86477b8-6733-463c-9c1e-3fdc3af149c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.945068 4778 generic.go:334] "Generic (PLEG): container finished" podID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerID="d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810" exitCode=0
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.945121 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerDied","Data":"d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810"}
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.945148 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerDied","Data":"4b279340952032e1de97141653a8272e40b4b4504477e11e39bed863cada10e3"}
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.945165 4778 scope.go:117] "RemoveContainer" containerID="d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810"
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.945470 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.966520 4778 scope.go:117] "RemoveContainer" containerID="cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2"
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.983842 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9lp4z"]
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.985309 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.985337 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.985354 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hmb4\" (UniqueName: \"kubernetes.io/projected/e86477b8-6733-463c-9c1e-3fdc3af149c8-kube-api-access-2hmb4\") on node \"crc\" DevicePath \"\""
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.993890 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9lp4z"]
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.008906 4778 scope.go:117] "RemoveContainer" containerID="12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.059363 4778 scope.go:117] "RemoveContainer" containerID="d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810"
Mar 18 10:11:52 crc kubenswrapper[4778]: E0318 10:11:52.060334 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810\": container with ID starting with d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810 not found: ID does not exist" containerID="d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.060373 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810"} err="failed to get container status \"d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810\": rpc error: code = NotFound desc = could not find container \"d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810\": container with ID starting with d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810 not found: ID does not exist"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.060402 4778 scope.go:117] "RemoveContainer" containerID="cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2"
Mar 18 10:11:52 crc kubenswrapper[4778]: E0318 10:11:52.065434 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2\": container with ID starting with cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2 not found: ID does not exist" containerID="cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.065472 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2"} err="failed to get container status \"cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2\": rpc error: code = NotFound desc = could not find container \"cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2\": container with ID starting with cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2 not found: ID does not exist"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.065493 4778 scope.go:117] "RemoveContainer" containerID="12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e"
Mar 18 10:11:52 crc kubenswrapper[4778]: E0318 10:11:52.066294 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e\": container with ID starting with 12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e not found: ID does not exist" containerID="12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.066316 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e"} err="failed to get container status \"12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e\": rpc error: code = NotFound desc = could not find container \"12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e\": container with ID starting with 12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e not found: ID does not exist"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.207088 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" path="/var/lib/kubelet/pods/e86477b8-6733-463c-9c1e-3fdc3af149c8/volumes"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.148181 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.148840 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.156663 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563812-2gvxv"]
Mar 18 10:12:00 crc kubenswrapper[4778]: E0318 10:12:00.157218 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="extract-utilities"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.157240 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="extract-utilities"
Mar 18 10:12:00 crc kubenswrapper[4778]: E0318 10:12:00.157274 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="registry-server"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.157282 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="registry-server"
Mar 18 10:12:00 crc kubenswrapper[4778]: E0318 10:12:00.157302 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="extract-content"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.157308 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="extract-content"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.157498 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="registry-server"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.158150 4778 util.go:30] "No
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.162073 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.162192 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.162981 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.167336 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563812-2gvxv"] Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.263368 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmjcx\" (UniqueName: \"kubernetes.io/projected/88f0d7a0-c27a-48d5-90f6-e7d7de946731-kube-api-access-cmjcx\") pod \"auto-csr-approver-29563812-2gvxv\" (UID: \"88f0d7a0-c27a-48d5-90f6-e7d7de946731\") " pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.365684 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmjcx\" (UniqueName: \"kubernetes.io/projected/88f0d7a0-c27a-48d5-90f6-e7d7de946731-kube-api-access-cmjcx\") pod \"auto-csr-approver-29563812-2gvxv\" (UID: \"88f0d7a0-c27a-48d5-90f6-e7d7de946731\") " pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.383386 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmjcx\" (UniqueName: \"kubernetes.io/projected/88f0d7a0-c27a-48d5-90f6-e7d7de946731-kube-api-access-cmjcx\") pod \"auto-csr-approver-29563812-2gvxv\" (UID: 
\"88f0d7a0-c27a-48d5-90f6-e7d7de946731\") " pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.477306 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:00 crc kubenswrapper[4778]: W0318 10:12:00.972498 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88f0d7a0_c27a_48d5_90f6_e7d7de946731.slice/crio-add7f32d41d027f207d24e12f67bedd31323fb435e1a1b972fdefbeaaf6a36ba WatchSource:0}: Error finding container add7f32d41d027f207d24e12f67bedd31323fb435e1a1b972fdefbeaaf6a36ba: Status 404 returned error can't find the container with id add7f32d41d027f207d24e12f67bedd31323fb435e1a1b972fdefbeaaf6a36ba Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.973551 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563812-2gvxv"] Mar 18 10:12:01 crc kubenswrapper[4778]: I0318 10:12:01.021923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" event={"ID":"88f0d7a0-c27a-48d5-90f6-e7d7de946731","Type":"ContainerStarted","Data":"add7f32d41d027f207d24e12f67bedd31323fb435e1a1b972fdefbeaaf6a36ba"} Mar 18 10:12:03 crc kubenswrapper[4778]: I0318 10:12:03.039683 4778 generic.go:334] "Generic (PLEG): container finished" podID="88f0d7a0-c27a-48d5-90f6-e7d7de946731" containerID="4d2240707a2956bb8da6399edaf60b6df2ea8a136aea3c9f29e332c118bf9bc2" exitCode=0 Mar 18 10:12:03 crc kubenswrapper[4778]: I0318 10:12:03.039724 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" event={"ID":"88f0d7a0-c27a-48d5-90f6-e7d7de946731","Type":"ContainerDied","Data":"4d2240707a2956bb8da6399edaf60b6df2ea8a136aea3c9f29e332c118bf9bc2"} Mar 18 10:12:04 crc kubenswrapper[4778]: I0318 10:12:04.557375 4778 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:04 crc kubenswrapper[4778]: I0318 10:12:04.669927 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmjcx\" (UniqueName: \"kubernetes.io/projected/88f0d7a0-c27a-48d5-90f6-e7d7de946731-kube-api-access-cmjcx\") pod \"88f0d7a0-c27a-48d5-90f6-e7d7de946731\" (UID: \"88f0d7a0-c27a-48d5-90f6-e7d7de946731\") " Mar 18 10:12:04 crc kubenswrapper[4778]: I0318 10:12:04.676224 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f0d7a0-c27a-48d5-90f6-e7d7de946731-kube-api-access-cmjcx" (OuterVolumeSpecName: "kube-api-access-cmjcx") pod "88f0d7a0-c27a-48d5-90f6-e7d7de946731" (UID: "88f0d7a0-c27a-48d5-90f6-e7d7de946731"). InnerVolumeSpecName "kube-api-access-cmjcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:12:04 crc kubenswrapper[4778]: I0318 10:12:04.772784 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmjcx\" (UniqueName: \"kubernetes.io/projected/88f0d7a0-c27a-48d5-90f6-e7d7de946731-kube-api-access-cmjcx\") on node \"crc\" DevicePath \"\"" Mar 18 10:12:05 crc kubenswrapper[4778]: I0318 10:12:05.059826 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" event={"ID":"88f0d7a0-c27a-48d5-90f6-e7d7de946731","Type":"ContainerDied","Data":"add7f32d41d027f207d24e12f67bedd31323fb435e1a1b972fdefbeaaf6a36ba"} Mar 18 10:12:05 crc kubenswrapper[4778]: I0318 10:12:05.059865 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="add7f32d41d027f207d24e12f67bedd31323fb435e1a1b972fdefbeaaf6a36ba" Mar 18 10:12:05 crc kubenswrapper[4778]: I0318 10:12:05.059872 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:05 crc kubenswrapper[4778]: I0318 10:12:05.650807 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563806-n2m7x"] Mar 18 10:12:05 crc kubenswrapper[4778]: I0318 10:12:05.659713 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563806-n2m7x"] Mar 18 10:12:06 crc kubenswrapper[4778]: I0318 10:12:06.198955 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbb455c0-90cf-46a9-82c4-1c22d05e007d" path="/var/lib/kubelet/pods/dbb455c0-90cf-46a9-82c4-1c22d05e007d/volumes" Mar 18 10:12:30 crc kubenswrapper[4778]: I0318 10:12:30.147580 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:12:30 crc kubenswrapper[4778]: I0318 10:12:30.148090 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:12:34 crc kubenswrapper[4778]: I0318 10:12:34.476623 4778 scope.go:117] "RemoveContainer" containerID="6b16a1172e80d155110d49b222a6dc20e7b21d0a6a9927e8e5966e673f37ac10" Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.920223 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjc8"] Mar 18 10:12:40 crc kubenswrapper[4778]: E0318 10:12:40.921064 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f0d7a0-c27a-48d5-90f6-e7d7de946731" containerName="oc" Mar 18 10:12:40 crc 
kubenswrapper[4778]: I0318 10:12:40.921075 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f0d7a0-c27a-48d5-90f6-e7d7de946731" containerName="oc" Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.921261 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="88f0d7a0-c27a-48d5-90f6-e7d7de946731" containerName="oc" Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.922670 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.932810 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjc8"] Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.961892 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-utilities\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.961943 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r427j\" (UniqueName: \"kubernetes.io/projected/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-kube-api-access-r427j\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.961999 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-catalog-content\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: 
I0318 10:12:41.064810 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r427j\" (UniqueName: \"kubernetes.io/projected/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-kube-api-access-r427j\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.064861 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-utilities\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.064920 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-catalog-content\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.065593 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-catalog-content\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.065680 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-utilities\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.084504 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r427j\" (UniqueName: \"kubernetes.io/projected/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-kube-api-access-r427j\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.268480 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.806816 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjc8"] Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.923092 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerStarted","Data":"2d6f09685cb2b5ee91a046b3d1408a6415873139bc173fe150fd41b97c3dca02"} Mar 18 10:12:42 crc kubenswrapper[4778]: I0318 10:12:42.938167 4778 generic.go:334] "Generic (PLEG): container finished" podID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerID="b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db" exitCode=0 Mar 18 10:12:42 crc kubenswrapper[4778]: I0318 10:12:42.938354 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerDied","Data":"b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db"} Mar 18 10:12:43 crc kubenswrapper[4778]: I0318 10:12:43.954355 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerStarted","Data":"0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2"} Mar 18 10:12:44 crc kubenswrapper[4778]: I0318 10:12:44.966428 4778 
generic.go:334] "Generic (PLEG): container finished" podID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerID="0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2" exitCode=0 Mar 18 10:12:44 crc kubenswrapper[4778]: I0318 10:12:44.966715 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerDied","Data":"0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2"} Mar 18 10:12:45 crc kubenswrapper[4778]: I0318 10:12:45.978341 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerStarted","Data":"72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331"} Mar 18 10:12:46 crc kubenswrapper[4778]: I0318 10:12:46.001704 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gzjc8" podStartSLOduration=3.568313438 podStartE2EDuration="6.001685143s" podCreationTimestamp="2026-03-18 10:12:40 +0000 UTC" firstStartedPulling="2026-03-18 10:12:42.940099911 +0000 UTC m=+4229.514844751" lastFinishedPulling="2026-03-18 10:12:45.373471616 +0000 UTC m=+4231.948216456" observedRunningTime="2026-03-18 10:12:45.998434855 +0000 UTC m=+4232.573179705" watchObservedRunningTime="2026-03-18 10:12:46.001685143 +0000 UTC m=+4232.576429973" Mar 18 10:12:51 crc kubenswrapper[4778]: I0318 10:12:51.269502 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:51 crc kubenswrapper[4778]: I0318 10:12:51.271525 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:51 crc kubenswrapper[4778]: I0318 10:12:51.316224 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:52 crc kubenswrapper[4778]: I0318 10:12:52.081061 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:52 crc kubenswrapper[4778]: I0318 10:12:52.129580 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjc8"] Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.047298 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gzjc8" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="registry-server" containerID="cri-o://72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331" gracePeriod=2 Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.705139 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.780540 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-utilities\") pod \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.780754 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r427j\" (UniqueName: \"kubernetes.io/projected/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-kube-api-access-r427j\") pod \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.780833 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-catalog-content\") pod 
\"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.783086 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-utilities" (OuterVolumeSpecName: "utilities") pod "5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" (UID: "5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.804260 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-kube-api-access-r427j" (OuterVolumeSpecName: "kube-api-access-r427j") pod "5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" (UID: "5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd"). InnerVolumeSpecName "kube-api-access-r427j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.826594 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" (UID: "5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.883343 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r427j\" (UniqueName: \"kubernetes.io/projected/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-kube-api-access-r427j\") on node \"crc\" DevicePath \"\"" Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.883380 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.883388 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.061886 4778 generic.go:334] "Generic (PLEG): container finished" podID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerID="72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331" exitCode=0 Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.061932 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerDied","Data":"72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331"} Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.061961 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerDied","Data":"2d6f09685cb2b5ee91a046b3d1408a6415873139bc173fe150fd41b97c3dca02"} Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.061975 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.061981 4778 scope.go:117] "RemoveContainer" containerID="72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.078379 4778 scope.go:117] "RemoveContainer" containerID="0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.095355 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjc8"] Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.105361 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjc8"] Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.129915 4778 scope.go:117] "RemoveContainer" containerID="b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.158622 4778 scope.go:117] "RemoveContainer" containerID="72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331" Mar 18 10:12:55 crc kubenswrapper[4778]: E0318 10:12:55.159213 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331\": container with ID starting with 72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331 not found: ID does not exist" containerID="72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.159256 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331"} err="failed to get container status \"72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331\": rpc error: code = NotFound desc = could not find container 
\"72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331\": container with ID starting with 72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331 not found: ID does not exist" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.159281 4778 scope.go:117] "RemoveContainer" containerID="0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2" Mar 18 10:12:55 crc kubenswrapper[4778]: E0318 10:12:55.159595 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2\": container with ID starting with 0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2 not found: ID does not exist" containerID="0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.159620 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2"} err="failed to get container status \"0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2\": rpc error: code = NotFound desc = could not find container \"0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2\": container with ID starting with 0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2 not found: ID does not exist" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.159635 4778 scope.go:117] "RemoveContainer" containerID="b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db" Mar 18 10:12:55 crc kubenswrapper[4778]: E0318 10:12:55.159846 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db\": container with ID starting with b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db not found: ID does not exist" 
containerID="b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.159878 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db"} err="failed to get container status \"b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db\": rpc error: code = NotFound desc = could not find container \"b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db\": container with ID starting with b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db not found: ID does not exist" Mar 18 10:12:56 crc kubenswrapper[4778]: I0318 10:12:56.203920 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" path="/var/lib/kubelet/pods/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd/volumes" Mar 18 10:13:00 crc kubenswrapper[4778]: I0318 10:13:00.148182 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:13:00 crc kubenswrapper[4778]: I0318 10:13:00.148825 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:13:00 crc kubenswrapper[4778]: I0318 10:13:00.148878 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:13:00 crc kubenswrapper[4778]: I0318 10:13:00.149810 4778 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5e0db80372568acf47512b0578afc67f28216d99c8be88c569e735224dbd2f3"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:13:00 crc kubenswrapper[4778]: I0318 10:13:00.149889 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://d5e0db80372568acf47512b0578afc67f28216d99c8be88c569e735224dbd2f3" gracePeriod=600 Mar 18 10:13:01 crc kubenswrapper[4778]: I0318 10:13:01.124645 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="d5e0db80372568acf47512b0578afc67f28216d99c8be88c569e735224dbd2f3" exitCode=0 Mar 18 10:13:01 crc kubenswrapper[4778]: I0318 10:13:01.124683 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"d5e0db80372568acf47512b0578afc67f28216d99c8be88c569e735224dbd2f3"} Mar 18 10:13:01 crc kubenswrapper[4778]: I0318 10:13:01.125247 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"} Mar 18 10:13:01 crc kubenswrapper[4778]: I0318 10:13:01.125272 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.140945 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563814-9vhtv"] Mar 18 10:14:00 
crc kubenswrapper[4778]: E0318 10:14:00.141922 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="extract-content" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.141939 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="extract-content" Mar 18 10:14:00 crc kubenswrapper[4778]: E0318 10:14:00.141968 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="registry-server" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.141976 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="registry-server" Mar 18 10:14:00 crc kubenswrapper[4778]: E0318 10:14:00.141991 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="extract-utilities" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.141998 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="extract-utilities" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.142246 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="registry-server" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.143041 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.145566 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.145587 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.145677 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.162776 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563814-9vhtv"] Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.278342 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb8jp\" (UniqueName: \"kubernetes.io/projected/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9-kube-api-access-nb8jp\") pod \"auto-csr-approver-29563814-9vhtv\" (UID: \"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9\") " pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.380537 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb8jp\" (UniqueName: \"kubernetes.io/projected/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9-kube-api-access-nb8jp\") pod \"auto-csr-approver-29563814-9vhtv\" (UID: \"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9\") " pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.403327 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb8jp\" (UniqueName: \"kubernetes.io/projected/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9-kube-api-access-nb8jp\") pod \"auto-csr-approver-29563814-9vhtv\" (UID: \"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9\") " 
pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.462802 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.931180 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563814-9vhtv"] Mar 18 10:14:01 crc kubenswrapper[4778]: I0318 10:14:01.804447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" event={"ID":"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9","Type":"ContainerStarted","Data":"7cf46fde82111a0308d19b744c884483b271af70285848283badab4bdb621026"} Mar 18 10:14:02 crc kubenswrapper[4778]: I0318 10:14:02.814055 4778 generic.go:334] "Generic (PLEG): container finished" podID="96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9" containerID="75cf773e9ab812c93a6cb361a559a8ad1ac8bf63ad2f27eb51c3d0a96daa619d" exitCode=0 Mar 18 10:14:02 crc kubenswrapper[4778]: I0318 10:14:02.814151 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" event={"ID":"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9","Type":"ContainerDied","Data":"75cf773e9ab812c93a6cb361a559a8ad1ac8bf63ad2f27eb51c3d0a96daa619d"} Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.296842 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.370477 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb8jp\" (UniqueName: \"kubernetes.io/projected/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9-kube-api-access-nb8jp\") pod \"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9\" (UID: \"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9\") " Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.376526 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9-kube-api-access-nb8jp" (OuterVolumeSpecName: "kube-api-access-nb8jp") pod "96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9" (UID: "96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9"). InnerVolumeSpecName "kube-api-access-nb8jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.472377 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb8jp\" (UniqueName: \"kubernetes.io/projected/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9-kube-api-access-nb8jp\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.834342 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" event={"ID":"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9","Type":"ContainerDied","Data":"7cf46fde82111a0308d19b744c884483b271af70285848283badab4bdb621026"} Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.834397 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cf46fde82111a0308d19b744c884483b271af70285848283badab4bdb621026" Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.834518 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:05 crc kubenswrapper[4778]: I0318 10:14:05.384466 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563808-8zn6m"] Mar 18 10:14:05 crc kubenswrapper[4778]: I0318 10:14:05.394397 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563808-8zn6m"] Mar 18 10:14:06 crc kubenswrapper[4778]: I0318 10:14:06.216989 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a5c01e2-3264-47f4-8081-48235752ef32" path="/var/lib/kubelet/pods/0a5c01e2-3264-47f4-8081-48235752ef32/volumes" Mar 18 10:14:34 crc kubenswrapper[4778]: I0318 10:14:34.621679 4778 scope.go:117] "RemoveContainer" containerID="763e05415a17628575fc7b5a79a4b0a9348cfd1deec11024a98e0e749d405535" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.147503 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.149662 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.154998 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf"] Mar 18 10:15:00 crc kubenswrapper[4778]: E0318 10:15:00.155393 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9" containerName="oc" Mar 18 
10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.155411 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9" containerName="oc" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.155651 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9" containerName="oc" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.156342 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.160908 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.161471 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.169161 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf"] Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.281570 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wvrq\" (UniqueName: \"kubernetes.io/projected/1a12e64d-d433-4f42-8aa6-cd1de264b346-kube-api-access-8wvrq\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.282043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a12e64d-d433-4f42-8aa6-cd1de264b346-config-volume\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.282102 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a12e64d-d433-4f42-8aa6-cd1de264b346-secret-volume\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.383941 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a12e64d-d433-4f42-8aa6-cd1de264b346-config-volume\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.383999 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a12e64d-d433-4f42-8aa6-cd1de264b346-secret-volume\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.384050 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wvrq\" (UniqueName: \"kubernetes.io/projected/1a12e64d-d433-4f42-8aa6-cd1de264b346-kube-api-access-8wvrq\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.385401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/1a12e64d-d433-4f42-8aa6-cd1de264b346-config-volume\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.393069 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a12e64d-d433-4f42-8aa6-cd1de264b346-secret-volume\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.401505 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wvrq\" (UniqueName: \"kubernetes.io/projected/1a12e64d-d433-4f42-8aa6-cd1de264b346-kube-api-access-8wvrq\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.480466 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:01 crc kubenswrapper[4778]: I0318 10:15:01.008383 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf"] Mar 18 10:15:01 crc kubenswrapper[4778]: I0318 10:15:01.340766 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" event={"ID":"1a12e64d-d433-4f42-8aa6-cd1de264b346","Type":"ContainerStarted","Data":"4c308b5ed19066acb80d31a8263c3f25bb04c0935256cdfa497ae0b275b40ad3"} Mar 18 10:15:01 crc kubenswrapper[4778]: I0318 10:15:01.341173 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" event={"ID":"1a12e64d-d433-4f42-8aa6-cd1de264b346","Type":"ContainerStarted","Data":"228cb5dd40dce2c37284b46c1282d4bbad64cdc2aaf29b59be8be0c202159e59"} Mar 18 10:15:01 crc kubenswrapper[4778]: I0318 10:15:01.363368 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" podStartSLOduration=1.363345749 podStartE2EDuration="1.363345749s" podCreationTimestamp="2026-03-18 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:01.356953985 +0000 UTC m=+4367.931698835" watchObservedRunningTime="2026-03-18 10:15:01.363345749 +0000 UTC m=+4367.938090609" Mar 18 10:15:02 crc kubenswrapper[4778]: I0318 10:15:02.351281 4778 generic.go:334] "Generic (PLEG): container finished" podID="1a12e64d-d433-4f42-8aa6-cd1de264b346" containerID="4c308b5ed19066acb80d31a8263c3f25bb04c0935256cdfa497ae0b275b40ad3" exitCode=0 Mar 18 10:15:02 crc kubenswrapper[4778]: I0318 10:15:02.351379 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" event={"ID":"1a12e64d-d433-4f42-8aa6-cd1de264b346","Type":"ContainerDied","Data":"4c308b5ed19066acb80d31a8263c3f25bb04c0935256cdfa497ae0b275b40ad3"} Mar 18 10:15:03 crc kubenswrapper[4778]: I0318 10:15:03.957341 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.062640 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a12e64d-d433-4f42-8aa6-cd1de264b346-secret-volume\") pod \"1a12e64d-d433-4f42-8aa6-cd1de264b346\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.062710 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a12e64d-d433-4f42-8aa6-cd1de264b346-config-volume\") pod \"1a12e64d-d433-4f42-8aa6-cd1de264b346\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.062797 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wvrq\" (UniqueName: \"kubernetes.io/projected/1a12e64d-d433-4f42-8aa6-cd1de264b346-kube-api-access-8wvrq\") pod \"1a12e64d-d433-4f42-8aa6-cd1de264b346\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.063765 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a12e64d-d433-4f42-8aa6-cd1de264b346-config-volume" (OuterVolumeSpecName: "config-volume") pod "1a12e64d-d433-4f42-8aa6-cd1de264b346" (UID: "1a12e64d-d433-4f42-8aa6-cd1de264b346"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.069078 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a12e64d-d433-4f42-8aa6-cd1de264b346-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1a12e64d-d433-4f42-8aa6-cd1de264b346" (UID: "1a12e64d-d433-4f42-8aa6-cd1de264b346"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.069278 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a12e64d-d433-4f42-8aa6-cd1de264b346-kube-api-access-8wvrq" (OuterVolumeSpecName: "kube-api-access-8wvrq") pod "1a12e64d-d433-4f42-8aa6-cd1de264b346" (UID: "1a12e64d-d433-4f42-8aa6-cd1de264b346"). InnerVolumeSpecName "kube-api-access-8wvrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.165303 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a12e64d-d433-4f42-8aa6-cd1de264b346-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.165601 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a12e64d-d433-4f42-8aa6-cd1de264b346-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.165612 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wvrq\" (UniqueName: \"kubernetes.io/projected/1a12e64d-d433-4f42-8aa6-cd1de264b346-kube-api-access-8wvrq\") on node \"crc\" DevicePath \"\"" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.376801 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" 
event={"ID":"1a12e64d-d433-4f42-8aa6-cd1de264b346","Type":"ContainerDied","Data":"228cb5dd40dce2c37284b46c1282d4bbad64cdc2aaf29b59be8be0c202159e59"} Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.376850 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="228cb5dd40dce2c37284b46c1282d4bbad64cdc2aaf29b59be8be0c202159e59" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.376877 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.450655 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq"] Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.459398 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq"] Mar 18 10:15:06 crc kubenswrapper[4778]: I0318 10:15:06.198311 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b2bddb-d94d-426e-bc18-8b864785e323" path="/var/lib/kubelet/pods/b6b2bddb-d94d-426e-bc18-8b864785e323/volumes" Mar 18 10:15:30 crc kubenswrapper[4778]: I0318 10:15:30.147592 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:15:30 crc kubenswrapper[4778]: I0318 10:15:30.148306 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:15:34 crc 
kubenswrapper[4778]: I0318 10:15:34.702551 4778 scope.go:117] "RemoveContainer" containerID="04359ca445cb3566112d245be577eaabe4ab24e27a18fca03074e13b6e3b403f" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.147085 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.148909 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.147640 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563816-ljgbz"] Mar 18 10:16:00 crc kubenswrapper[4778]: E0318 10:16:00.149547 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a12e64d-d433-4f42-8aa6-cd1de264b346" containerName="collect-profiles" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.149581 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a12e64d-d433-4f42-8aa6-cd1de264b346" containerName="collect-profiles" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.149818 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a12e64d-d433-4f42-8aa6-cd1de264b346" containerName="collect-profiles" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.150533 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.150690 4778 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.151475 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.151535 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" gracePeriod=600 Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.152892 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.153022 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.154276 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.166935 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563816-ljgbz"] Mar 18 10:16:00 crc kubenswrapper[4778]: E0318 10:16:00.275797 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.318598 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nll7f\" (UniqueName: \"kubernetes.io/projected/62302aba-bf34-4318-9599-2752789a925f-kube-api-access-nll7f\") pod \"auto-csr-approver-29563816-ljgbz\" (UID: \"62302aba-bf34-4318-9599-2752789a925f\") " pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.420384 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nll7f\" (UniqueName: \"kubernetes.io/projected/62302aba-bf34-4318-9599-2752789a925f-kube-api-access-nll7f\") pod \"auto-csr-approver-29563816-ljgbz\" (UID: \"62302aba-bf34-4318-9599-2752789a925f\") " pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.443014 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nll7f\" (UniqueName: \"kubernetes.io/projected/62302aba-bf34-4318-9599-2752789a925f-kube-api-access-nll7f\") pod \"auto-csr-approver-29563816-ljgbz\" (UID: \"62302aba-bf34-4318-9599-2752789a925f\") " pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.473755 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.894045 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" exitCode=0 Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.894447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"} Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.894513 4778 scope.go:117] "RemoveContainer" containerID="d5e0db80372568acf47512b0578afc67f28216d99c8be88c569e735224dbd2f3" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.895424 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:16:00 crc kubenswrapper[4778]: E0318 10:16:00.895866 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.960328 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563816-ljgbz"] Mar 18 10:16:01 crc kubenswrapper[4778]: I0318 10:16:01.906049 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" 
event={"ID":"62302aba-bf34-4318-9599-2752789a925f","Type":"ContainerStarted","Data":"da2f06e59903549c6e097532a76a034c46381787a357883f630ac75141535355"} Mar 18 10:16:02 crc kubenswrapper[4778]: I0318 10:16:02.916598 4778 generic.go:334] "Generic (PLEG): container finished" podID="62302aba-bf34-4318-9599-2752789a925f" containerID="86539332f3b2bee69c9852d9c08bf1b20f84cb5d7d5b3975360dc3cdaf5134cb" exitCode=0 Mar 18 10:16:02 crc kubenswrapper[4778]: I0318 10:16:02.916655 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" event={"ID":"62302aba-bf34-4318-9599-2752789a925f","Type":"ContainerDied","Data":"86539332f3b2bee69c9852d9c08bf1b20f84cb5d7d5b3975360dc3cdaf5134cb"} Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.430439 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.507291 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nll7f\" (UniqueName: \"kubernetes.io/projected/62302aba-bf34-4318-9599-2752789a925f-kube-api-access-nll7f\") pod \"62302aba-bf34-4318-9599-2752789a925f\" (UID: \"62302aba-bf34-4318-9599-2752789a925f\") " Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.519883 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62302aba-bf34-4318-9599-2752789a925f-kube-api-access-nll7f" (OuterVolumeSpecName: "kube-api-access-nll7f") pod "62302aba-bf34-4318-9599-2752789a925f" (UID: "62302aba-bf34-4318-9599-2752789a925f"). InnerVolumeSpecName "kube-api-access-nll7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.610015 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nll7f\" (UniqueName: \"kubernetes.io/projected/62302aba-bf34-4318-9599-2752789a925f-kube-api-access-nll7f\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.934639 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" event={"ID":"62302aba-bf34-4318-9599-2752789a925f","Type":"ContainerDied","Data":"da2f06e59903549c6e097532a76a034c46381787a357883f630ac75141535355"} Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.934683 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.934686 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da2f06e59903549c6e097532a76a034c46381787a357883f630ac75141535355" Mar 18 10:16:05 crc kubenswrapper[4778]: I0318 10:16:05.513986 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563810-gbsth"] Mar 18 10:16:05 crc kubenswrapper[4778]: I0318 10:16:05.521721 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563810-gbsth"] Mar 18 10:16:06 crc kubenswrapper[4778]: I0318 10:16:06.199428 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da82992-5b46-450f-9fe2-fb1aab2e40a5" path="/var/lib/kubelet/pods/2da82992-5b46-450f-9fe2-fb1aab2e40a5/volumes" Mar 18 10:16:15 crc kubenswrapper[4778]: I0318 10:16:15.187659 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:16:15 crc kubenswrapper[4778]: E0318 10:16:15.188472 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:16:30 crc kubenswrapper[4778]: I0318 10:16:30.190291 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:16:30 crc kubenswrapper[4778]: E0318 10:16:30.191537 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:16:34 crc kubenswrapper[4778]: I0318 10:16:34.818098 4778 scope.go:117] "RemoveContainer" containerID="92b3bdd08fed961c28977d075899ad8197ec334db09149cee7b1c8a99c5b48b6" Mar 18 10:16:41 crc kubenswrapper[4778]: I0318 10:16:41.187522 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:16:41 crc kubenswrapper[4778]: E0318 10:16:41.189442 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.793059 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-gb2nl"] Mar 18 10:16:49 crc kubenswrapper[4778]: E0318 10:16:49.794176 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62302aba-bf34-4318-9599-2752789a925f" containerName="oc" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.794194 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="62302aba-bf34-4318-9599-2752789a925f" containerName="oc" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.794464 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="62302aba-bf34-4318-9599-2752789a925f" containerName="oc" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.796155 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.804613 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gb2nl"] Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.942679 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-utilities\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.942850 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxtw\" (UniqueName: \"kubernetes.io/projected/be5e794e-a8d6-4d21-9456-03d0a7a34846-kube-api-access-5hxtw\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.943101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-catalog-content\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.045559 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-catalog-content\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.046012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-utilities\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.046114 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxtw\" (UniqueName: \"kubernetes.io/projected/be5e794e-a8d6-4d21-9456-03d0a7a34846-kube-api-access-5hxtw\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.046148 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-catalog-content\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.046451 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-utilities\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.191152 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxtw\" (UniqueName: \"kubernetes.io/projected/be5e794e-a8d6-4d21-9456-03d0a7a34846-kube-api-access-5hxtw\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.417638 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.907056 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gb2nl"] Mar 18 10:16:51 crc kubenswrapper[4778]: I0318 10:16:51.370621 4778 generic.go:334] "Generic (PLEG): container finished" podID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerID="796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999" exitCode=0 Mar 18 10:16:51 crc kubenswrapper[4778]: I0318 10:16:51.370820 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerDied","Data":"796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999"} Mar 18 10:16:51 crc kubenswrapper[4778]: I0318 10:16:51.370895 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerStarted","Data":"129f20c68023ea235efa0219ab242d6804e3b67294ead1332540e57fdf205fd8"} Mar 18 10:16:51 crc kubenswrapper[4778]: I0318 10:16:51.372546 4778 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 18 10:16:53 crc kubenswrapper[4778]: I0318 10:16:53.411516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerStarted","Data":"1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3"} Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.191138 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:16:56 crc kubenswrapper[4778]: E0318 10:16:56.192658 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.589271 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8d89d"] Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.592872 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.616535 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d89d"] Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.679448 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-utilities\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.679593 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfwxg\" (UniqueName: \"kubernetes.io/projected/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-kube-api-access-sfwxg\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.679673 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-catalog-content\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.781830 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-utilities\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.781961 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sfwxg\" (UniqueName: \"kubernetes.io/projected/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-kube-api-access-sfwxg\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.782031 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-catalog-content\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.782304 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-utilities\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.782421 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-catalog-content\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.802451 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfwxg\" (UniqueName: \"kubernetes.io/projected/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-kube-api-access-sfwxg\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.940346 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:57 crc kubenswrapper[4778]: I0318 10:16:57.625089 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d89d"] Mar 18 10:16:58 crc kubenswrapper[4778]: I0318 10:16:58.474589 4778 generic.go:334] "Generic (PLEG): container finished" podID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerID="1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3" exitCode=0 Mar 18 10:16:58 crc kubenswrapper[4778]: I0318 10:16:58.474650 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerDied","Data":"1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3"} Mar 18 10:16:58 crc kubenswrapper[4778]: I0318 10:16:58.476365 4778 generic.go:334] "Generic (PLEG): container finished" podID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerID="16e14b18ce5d99d4bf62f995ec0986acb244f3c54b40176f880b970232b47c61" exitCode=0 Mar 18 10:16:58 crc kubenswrapper[4778]: I0318 10:16:58.476395 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerDied","Data":"16e14b18ce5d99d4bf62f995ec0986acb244f3c54b40176f880b970232b47c61"} Mar 18 10:16:58 crc kubenswrapper[4778]: I0318 10:16:58.476419 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerStarted","Data":"8676964db6e545f0e4da12fd8e93365fda854a811aa027e80da8717a823aad80"} Mar 18 10:17:00 crc kubenswrapper[4778]: I0318 10:17:00.494573 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" 
event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerStarted","Data":"429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75"} Mar 18 10:17:00 crc kubenswrapper[4778]: I0318 10:17:00.498554 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerStarted","Data":"b174d9801b7aa8378e68651c33f14141cf83af4616ac0d44c4331c4c0f2958ac"} Mar 18 10:17:00 crc kubenswrapper[4778]: I0318 10:17:00.525514 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gb2nl" podStartSLOduration=3.994174304 podStartE2EDuration="11.525490246s" podCreationTimestamp="2026-03-18 10:16:49 +0000 UTC" firstStartedPulling="2026-03-18 10:16:51.37231964 +0000 UTC m=+4477.947064480" lastFinishedPulling="2026-03-18 10:16:58.903635582 +0000 UTC m=+4485.478380422" observedRunningTime="2026-03-18 10:17:00.513344497 +0000 UTC m=+4487.088089347" watchObservedRunningTime="2026-03-18 10:17:00.525490246 +0000 UTC m=+4487.100235096" Mar 18 10:17:01 crc kubenswrapper[4778]: I0318 10:17:01.509803 4778 generic.go:334] "Generic (PLEG): container finished" podID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerID="b174d9801b7aa8378e68651c33f14141cf83af4616ac0d44c4331c4c0f2958ac" exitCode=0 Mar 18 10:17:01 crc kubenswrapper[4778]: I0318 10:17:01.509865 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerDied","Data":"b174d9801b7aa8378e68651c33f14141cf83af4616ac0d44c4331c4c0f2958ac"} Mar 18 10:17:02 crc kubenswrapper[4778]: I0318 10:17:02.520355 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" 
event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerStarted","Data":"1c5faed70007af88626ce40e82dc1953d592c773e281c805638237b1256535ca"} Mar 18 10:17:02 crc kubenswrapper[4778]: I0318 10:17:02.536846 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8d89d" podStartSLOduration=3.118537568 podStartE2EDuration="6.536825984s" podCreationTimestamp="2026-03-18 10:16:56 +0000 UTC" firstStartedPulling="2026-03-18 10:16:58.477424192 +0000 UTC m=+4485.052169042" lastFinishedPulling="2026-03-18 10:17:01.895712618 +0000 UTC m=+4488.470457458" observedRunningTime="2026-03-18 10:17:02.536503746 +0000 UTC m=+4489.111248606" watchObservedRunningTime="2026-03-18 10:17:02.536825984 +0000 UTC m=+4489.111570824" Mar 18 10:17:06 crc kubenswrapper[4778]: I0318 10:17:06.941136 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:17:06 crc kubenswrapper[4778]: I0318 10:17:06.941691 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:17:07 crc kubenswrapper[4778]: I0318 10:17:07.001254 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:17:07 crc kubenswrapper[4778]: I0318 10:17:07.622036 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:17:07 crc kubenswrapper[4778]: I0318 10:17:07.669086 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8d89d"] Mar 18 10:17:09 crc kubenswrapper[4778]: I0318 10:17:09.581500 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8d89d" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="registry-server" 
containerID="cri-o://1c5faed70007af88626ce40e82dc1953d592c773e281c805638237b1256535ca" gracePeriod=2 Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.423395 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.423775 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.600696 4778 generic.go:334] "Generic (PLEG): container finished" podID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerID="1c5faed70007af88626ce40e82dc1953d592c773e281c805638237b1256535ca" exitCode=0 Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.600744 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerDied","Data":"1c5faed70007af88626ce40e82dc1953d592c773e281c805638237b1256535ca"} Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.852103 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.972181 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-utilities\") pod \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.972501 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-catalog-content\") pod \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.972608 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfwxg\" (UniqueName: \"kubernetes.io/projected/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-kube-api-access-sfwxg\") pod \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.972714 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-utilities" (OuterVolumeSpecName: "utilities") pod "331c7aeb-0ba9-437d-b1aa-df9880d3f53d" (UID: "331c7aeb-0ba9-437d-b1aa-df9880d3f53d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.973126 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.980545 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-kube-api-access-sfwxg" (OuterVolumeSpecName: "kube-api-access-sfwxg") pod "331c7aeb-0ba9-437d-b1aa-df9880d3f53d" (UID: "331c7aeb-0ba9-437d-b1aa-df9880d3f53d"). InnerVolumeSpecName "kube-api-access-sfwxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.051659 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "331c7aeb-0ba9-437d-b1aa-df9880d3f53d" (UID: "331c7aeb-0ba9-437d-b1aa-df9880d3f53d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.075383 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.075425 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfwxg\" (UniqueName: \"kubernetes.io/projected/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-kube-api-access-sfwxg\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.187453 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:17:11 crc kubenswrapper[4778]: E0318 10:17:11.187706 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.494445 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gb2nl" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="registry-server" probeResult="failure" output=< Mar 18 10:17:11 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:17:11 crc kubenswrapper[4778]: > Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.609492 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" 
event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerDied","Data":"8676964db6e545f0e4da12fd8e93365fda854a811aa027e80da8717a823aad80"} Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.609560 4778 scope.go:117] "RemoveContainer" containerID="1c5faed70007af88626ce40e82dc1953d592c773e281c805638237b1256535ca" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.610587 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.641551 4778 scope.go:117] "RemoveContainer" containerID="b174d9801b7aa8378e68651c33f14141cf83af4616ac0d44c4331c4c0f2958ac" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.643488 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8d89d"] Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.653973 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8d89d"] Mar 18 10:17:12 crc kubenswrapper[4778]: I0318 10:17:12.018566 4778 scope.go:117] "RemoveContainer" containerID="16e14b18ce5d99d4bf62f995ec0986acb244f3c54b40176f880b970232b47c61" Mar 18 10:17:12 crc kubenswrapper[4778]: I0318 10:17:12.201949 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" path="/var/lib/kubelet/pods/331c7aeb-0ba9-437d-b1aa-df9880d3f53d/volumes" Mar 18 10:17:20 crc kubenswrapper[4778]: I0318 10:17:20.481294 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:17:20 crc kubenswrapper[4778]: I0318 10:17:20.541609 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:17:20 crc kubenswrapper[4778]: I0318 10:17:20.996100 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-gb2nl"] Mar 18 10:17:21 crc kubenswrapper[4778]: I0318 10:17:21.691821 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gb2nl" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="registry-server" containerID="cri-o://429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75" gracePeriod=2 Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.322689 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.408316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-catalog-content\") pod \"be5e794e-a8d6-4d21-9456-03d0a7a34846\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.408553 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hxtw\" (UniqueName: \"kubernetes.io/projected/be5e794e-a8d6-4d21-9456-03d0a7a34846-kube-api-access-5hxtw\") pod \"be5e794e-a8d6-4d21-9456-03d0a7a34846\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.408599 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-utilities\") pod \"be5e794e-a8d6-4d21-9456-03d0a7a34846\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.409798 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-utilities" (OuterVolumeSpecName: "utilities") pod "be5e794e-a8d6-4d21-9456-03d0a7a34846" (UID: 
"be5e794e-a8d6-4d21-9456-03d0a7a34846"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.511105 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.539500 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be5e794e-a8d6-4d21-9456-03d0a7a34846" (UID: "be5e794e-a8d6-4d21-9456-03d0a7a34846"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.613295 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.701339 4778 generic.go:334] "Generic (PLEG): container finished" podID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerID="429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75" exitCode=0 Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.701572 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerDied","Data":"429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75"} Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.702286 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerDied","Data":"129f20c68023ea235efa0219ab242d6804e3b67294ead1332540e57fdf205fd8"} Mar 
18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.702365 4778 scope.go:117] "RemoveContainer" containerID="429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75"
Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.701684 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gb2nl"
Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.727317 4778 scope.go:117] "RemoveContainer" containerID="1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3"
Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.904054 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5e794e-a8d6-4d21-9456-03d0a7a34846-kube-api-access-5hxtw" (OuterVolumeSpecName: "kube-api-access-5hxtw") pod "be5e794e-a8d6-4d21-9456-03d0a7a34846" (UID: "be5e794e-a8d6-4d21-9456-03d0a7a34846"). InnerVolumeSpecName "kube-api-access-5hxtw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.919213 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hxtw\" (UniqueName: \"kubernetes.io/projected/be5e794e-a8d6-4d21-9456-03d0a7a34846-kube-api-access-5hxtw\") on node \"crc\" DevicePath \"\""
Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.925576 4778 scope.go:117] "RemoveContainer" containerID="796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.091507 4778 scope.go:117] "RemoveContainer" containerID="429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75"
Mar 18 10:17:23 crc kubenswrapper[4778]: E0318 10:17:23.092051 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75\": container with ID starting with 429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75 not found: ID does not exist" containerID="429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.092102 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75"} err="failed to get container status \"429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75\": rpc error: code = NotFound desc = could not find container \"429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75\": container with ID starting with 429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75 not found: ID does not exist"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.092138 4778 scope.go:117] "RemoveContainer" containerID="1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3"
Mar 18 10:17:23 crc kubenswrapper[4778]: E0318 10:17:23.092760 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3\": container with ID starting with 1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3 not found: ID does not exist" containerID="1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.092871 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3"} err="failed to get container status \"1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3\": rpc error: code = NotFound desc = could not find container \"1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3\": container with ID starting with 1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3 not found: ID does not exist"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.092963 4778 scope.go:117] "RemoveContainer" containerID="796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999"
Mar 18 10:17:23 crc kubenswrapper[4778]: E0318 10:17:23.093858 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999\": container with ID starting with 796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999 not found: ID does not exist" containerID="796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.093894 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999"} err="failed to get container status \"796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999\": rpc error: code = NotFound desc = could not find container \"796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999\": container with ID starting with 796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999 not found: ID does not exist"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.244359 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gb2nl"]
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.260474 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gb2nl"]
Mar 18 10:17:24 crc kubenswrapper[4778]: I0318 10:17:24.201043 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" path="/var/lib/kubelet/pods/be5e794e-a8d6-4d21-9456-03d0a7a34846/volumes"
Mar 18 10:17:25 crc kubenswrapper[4778]: I0318 10:17:25.187549 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:17:25 crc kubenswrapper[4778]: E0318 10:17:25.188285 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:17:40 crc kubenswrapper[4778]: I0318 10:17:40.187111 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:17:40 crc kubenswrapper[4778]: E0318 10:17:40.187956 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:17:52 crc kubenswrapper[4778]: I0318 10:17:52.187157 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:17:52 crc kubenswrapper[4778]: E0318 10:17:52.188088 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.154773 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563818-c776t"]
Mar 18 10:18:00 crc kubenswrapper[4778]: E0318 10:18:00.155609 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="registry-server"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155622 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="registry-server"
Mar 18 10:18:00 crc kubenswrapper[4778]: E0318 10:18:00.155635 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="registry-server"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155641 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="registry-server"
Mar 18 10:18:00 crc kubenswrapper[4778]: E0318 10:18:00.155651 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="extract-content"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155656 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="extract-content"
Mar 18 10:18:00 crc kubenswrapper[4778]: E0318 10:18:00.155671 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="extract-content"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155677 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="extract-content"
Mar 18 10:18:00 crc kubenswrapper[4778]: E0318 10:18:00.155690 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="extract-utilities"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155697 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="extract-utilities"
Mar 18 10:18:00 crc kubenswrapper[4778]: E0318 10:18:00.155716 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="extract-utilities"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155721 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="extract-utilities"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155889 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="registry-server"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155908 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="registry-server"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.156551 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.159747 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.159823 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.159894 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.172080 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563818-c776t"]
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.196840 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d28w8\" (UniqueName: \"kubernetes.io/projected/9b16b969-ac86-4725-910d-797cd1faedc9-kube-api-access-d28w8\") pod \"auto-csr-approver-29563818-c776t\" (UID: \"9b16b969-ac86-4725-910d-797cd1faedc9\") " pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.300910 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d28w8\" (UniqueName: \"kubernetes.io/projected/9b16b969-ac86-4725-910d-797cd1faedc9-kube-api-access-d28w8\") pod \"auto-csr-approver-29563818-c776t\" (UID: \"9b16b969-ac86-4725-910d-797cd1faedc9\") " pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.320446 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d28w8\" (UniqueName: \"kubernetes.io/projected/9b16b969-ac86-4725-910d-797cd1faedc9-kube-api-access-d28w8\") pod \"auto-csr-approver-29563818-c776t\" (UID: \"9b16b969-ac86-4725-910d-797cd1faedc9\") " pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.475745 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.973398 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563818-c776t"]
Mar 18 10:18:01 crc kubenswrapper[4778]: W0318 10:18:01.012306 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b16b969_ac86_4725_910d_797cd1faedc9.slice/crio-ad523834ef8921b31b24422ee6b957140ac8d58bab55ffa29e223c1c2b5d82f1 WatchSource:0}: Error finding container ad523834ef8921b31b24422ee6b957140ac8d58bab55ffa29e223c1c2b5d82f1: Status 404 returned error can't find the container with id ad523834ef8921b31b24422ee6b957140ac8d58bab55ffa29e223c1c2b5d82f1
Mar 18 10:18:01 crc kubenswrapper[4778]: I0318 10:18:01.076857 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563818-c776t" event={"ID":"9b16b969-ac86-4725-910d-797cd1faedc9","Type":"ContainerStarted","Data":"ad523834ef8921b31b24422ee6b957140ac8d58bab55ffa29e223c1c2b5d82f1"}
Mar 18 10:18:03 crc kubenswrapper[4778]: I0318 10:18:03.101873 4778 generic.go:334] "Generic (PLEG): container finished" podID="9b16b969-ac86-4725-910d-797cd1faedc9" containerID="ecf9caf224b383664e625024d70d285619a974017bb319f2a60a8627b5e0d68b" exitCode=0
Mar 18 10:18:03 crc kubenswrapper[4778]: I0318 10:18:03.101963 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563818-c776t" event={"ID":"9b16b969-ac86-4725-910d-797cd1faedc9","Type":"ContainerDied","Data":"ecf9caf224b383664e625024d70d285619a974017bb319f2a60a8627b5e0d68b"}
Mar 18 10:18:04 crc kubenswrapper[4778]: I0318 10:18:04.795054 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:04 crc kubenswrapper[4778]: I0318 10:18:04.901752 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d28w8\" (UniqueName: \"kubernetes.io/projected/9b16b969-ac86-4725-910d-797cd1faedc9-kube-api-access-d28w8\") pod \"9b16b969-ac86-4725-910d-797cd1faedc9\" (UID: \"9b16b969-ac86-4725-910d-797cd1faedc9\") "
Mar 18 10:18:04 crc kubenswrapper[4778]: I0318 10:18:04.907603 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b16b969-ac86-4725-910d-797cd1faedc9-kube-api-access-d28w8" (OuterVolumeSpecName: "kube-api-access-d28w8") pod "9b16b969-ac86-4725-910d-797cd1faedc9" (UID: "9b16b969-ac86-4725-910d-797cd1faedc9"). InnerVolumeSpecName "kube-api-access-d28w8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:18:05 crc kubenswrapper[4778]: I0318 10:18:05.003930 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d28w8\" (UniqueName: \"kubernetes.io/projected/9b16b969-ac86-4725-910d-797cd1faedc9-kube-api-access-d28w8\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:05 crc kubenswrapper[4778]: I0318 10:18:05.121137 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563818-c776t" event={"ID":"9b16b969-ac86-4725-910d-797cd1faedc9","Type":"ContainerDied","Data":"ad523834ef8921b31b24422ee6b957140ac8d58bab55ffa29e223c1c2b5d82f1"}
Mar 18 10:18:05 crc kubenswrapper[4778]: I0318 10:18:05.121617 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad523834ef8921b31b24422ee6b957140ac8d58bab55ffa29e223c1c2b5d82f1"
Mar 18 10:18:05 crc kubenswrapper[4778]: I0318 10:18:05.121393 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:05 crc kubenswrapper[4778]: I0318 10:18:05.864529 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563812-2gvxv"]
Mar 18 10:18:05 crc kubenswrapper[4778]: I0318 10:18:05.871222 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563812-2gvxv"]
Mar 18 10:18:06 crc kubenswrapper[4778]: I0318 10:18:06.197975 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88f0d7a0-c27a-48d5-90f6-e7d7de946731" path="/var/lib/kubelet/pods/88f0d7a0-c27a-48d5-90f6-e7d7de946731/volumes"
Mar 18 10:18:07 crc kubenswrapper[4778]: I0318 10:18:07.188591 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:18:07 crc kubenswrapper[4778]: E0318 10:18:07.188967 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:18:20 crc kubenswrapper[4778]: I0318 10:18:20.187094 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:18:20 crc kubenswrapper[4778]: E0318 10:18:20.187998 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:18:31 crc kubenswrapper[4778]: I0318 10:18:31.186966 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:18:31 crc kubenswrapper[4778]: E0318 10:18:31.187817 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:18:34 crc kubenswrapper[4778]: I0318 10:18:34.939909 4778 scope.go:117] "RemoveContainer" containerID="4d2240707a2956bb8da6399edaf60b6df2ea8a136aea3c9f29e332c118bf9bc2"
Mar 18 10:18:45 crc kubenswrapper[4778]: I0318 10:18:45.186985 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:18:45 crc kubenswrapper[4778]: E0318 10:18:45.187754 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:19:00 crc kubenswrapper[4778]: I0318 10:19:00.188225 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:19:00 crc kubenswrapper[4778]: E0318 10:19:00.189345 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:19:12 crc kubenswrapper[4778]: I0318 10:19:12.187090 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:19:12 crc kubenswrapper[4778]: E0318 10:19:12.187948 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:19:23 crc kubenswrapper[4778]: I0318 10:19:23.188213 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:19:23 crc kubenswrapper[4778]: E0318 10:19:23.189013 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:19:36 crc kubenswrapper[4778]: I0318 10:19:36.187032 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:19:36 crc kubenswrapper[4778]: E0318 10:19:36.188020 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:19:48 crc kubenswrapper[4778]: I0318 10:19:48.187328 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:19:48 crc kubenswrapper[4778]: E0318 10:19:48.188046 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.144906 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563820-xjqs9"]
Mar 18 10:20:00 crc kubenswrapper[4778]: E0318 10:20:00.146370 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b16b969-ac86-4725-910d-797cd1faedc9" containerName="oc"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.146392 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b16b969-ac86-4725-910d-797cd1faedc9" containerName="oc"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.146610 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b16b969-ac86-4725-910d-797cd1faedc9" containerName="oc"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.147479 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.150298 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.150372 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.151889 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.157119 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563820-xjqs9"]
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.304115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8km7c\" (UniqueName: \"kubernetes.io/projected/0b4b190c-f80e-4256-9025-f04279c3b3db-kube-api-access-8km7c\") pod \"auto-csr-approver-29563820-xjqs9\" (UID: \"0b4b190c-f80e-4256-9025-f04279c3b3db\") " pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.406403 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8km7c\" (UniqueName: \"kubernetes.io/projected/0b4b190c-f80e-4256-9025-f04279c3b3db-kube-api-access-8km7c\") pod \"auto-csr-approver-29563820-xjqs9\" (UID: \"0b4b190c-f80e-4256-9025-f04279c3b3db\") " pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.427902 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8km7c\" (UniqueName: \"kubernetes.io/projected/0b4b190c-f80e-4256-9025-f04279c3b3db-kube-api-access-8km7c\") pod \"auto-csr-approver-29563820-xjqs9\" (UID: \"0b4b190c-f80e-4256-9025-f04279c3b3db\") " pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.466023 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.926735 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563820-xjqs9"]
Mar 18 10:20:01 crc kubenswrapper[4778]: I0318 10:20:01.177322 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563820-xjqs9" event={"ID":"0b4b190c-f80e-4256-9025-f04279c3b3db","Type":"ContainerStarted","Data":"0ebb7dc41081892e638316fbe1956fb7acd1ee464e83067dd5b224734c60d953"}
Mar 18 10:20:02 crc kubenswrapper[4778]: I0318 10:20:02.185521 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563820-xjqs9" event={"ID":"0b4b190c-f80e-4256-9025-f04279c3b3db","Type":"ContainerStarted","Data":"f18fd314b43b902914cecf41007b7ceab63380321a7a2e1eeb69b1d4c6a07a98"}
Mar 18 10:20:02 crc kubenswrapper[4778]: I0318 10:20:02.220937 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563820-xjqs9" podStartSLOduration=1.331802742 podStartE2EDuration="2.220913156s" podCreationTimestamp="2026-03-18 10:20:00 +0000 UTC" firstStartedPulling="2026-03-18 10:20:00.934816646 +0000 UTC m=+4667.509561486" lastFinishedPulling="2026-03-18 10:20:01.82392706 +0000 UTC m=+4668.398671900" observedRunningTime="2026-03-18 10:20:02.20594245 +0000 UTC m=+4668.780687310" watchObservedRunningTime="2026-03-18 10:20:02.220913156 +0000 UTC m=+4668.795657996"
Mar 18 10:20:03 crc kubenswrapper[4778]: I0318 10:20:03.187116 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:20:03 crc kubenswrapper[4778]: E0318 10:20:03.187806 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:20:03 crc kubenswrapper[4778]: I0318 10:20:03.214187 4778 generic.go:334] "Generic (PLEG): container finished" podID="0b4b190c-f80e-4256-9025-f04279c3b3db" containerID="f18fd314b43b902914cecf41007b7ceab63380321a7a2e1eeb69b1d4c6a07a98" exitCode=0
Mar 18 10:20:03 crc kubenswrapper[4778]: I0318 10:20:03.214291 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563820-xjqs9" event={"ID":"0b4b190c-f80e-4256-9025-f04279c3b3db","Type":"ContainerDied","Data":"f18fd314b43b902914cecf41007b7ceab63380321a7a2e1eeb69b1d4c6a07a98"}
Mar 18 10:20:04 crc kubenswrapper[4778]: I0318 10:20:04.787441 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:04 crc kubenswrapper[4778]: I0318 10:20:04.895990 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8km7c\" (UniqueName: \"kubernetes.io/projected/0b4b190c-f80e-4256-9025-f04279c3b3db-kube-api-access-8km7c\") pod \"0b4b190c-f80e-4256-9025-f04279c3b3db\" (UID: \"0b4b190c-f80e-4256-9025-f04279c3b3db\") "
Mar 18 10:20:04 crc kubenswrapper[4778]: I0318 10:20:04.909615 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b4b190c-f80e-4256-9025-f04279c3b3db-kube-api-access-8km7c" (OuterVolumeSpecName: "kube-api-access-8km7c") pod "0b4b190c-f80e-4256-9025-f04279c3b3db" (UID: "0b4b190c-f80e-4256-9025-f04279c3b3db"). InnerVolumeSpecName "kube-api-access-8km7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:20:04 crc kubenswrapper[4778]: I0318 10:20:04.998664 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8km7c\" (UniqueName: \"kubernetes.io/projected/0b4b190c-f80e-4256-9025-f04279c3b3db-kube-api-access-8km7c\") on node \"crc\" DevicePath \"\""
Mar 18 10:20:05 crc kubenswrapper[4778]: I0318 10:20:05.231783 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563820-xjqs9" event={"ID":"0b4b190c-f80e-4256-9025-f04279c3b3db","Type":"ContainerDied","Data":"0ebb7dc41081892e638316fbe1956fb7acd1ee464e83067dd5b224734c60d953"}
Mar 18 10:20:05 crc kubenswrapper[4778]: I0318 10:20:05.232086 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ebb7dc41081892e638316fbe1956fb7acd1ee464e83067dd5b224734c60d953"
Mar 18 10:20:05 crc kubenswrapper[4778]: I0318 10:20:05.232153 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:05 crc kubenswrapper[4778]: I0318 10:20:05.283125 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563814-9vhtv"]
Mar 18 10:20:05 crc kubenswrapper[4778]: I0318 10:20:05.292300 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563814-9vhtv"]
Mar 18 10:20:06 crc kubenswrapper[4778]: I0318 10:20:06.196809 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9" path="/var/lib/kubelet/pods/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9/volumes"
Mar 18 10:20:17 crc kubenswrapper[4778]: I0318 10:20:17.187381 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:20:17 crc kubenswrapper[4778]: E0318 10:20:17.190417 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:20:32 crc kubenswrapper[4778]: I0318 10:20:32.187829 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:20:32 crc kubenswrapper[4778]: E0318 10:20:32.188489 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:20:35 crc kubenswrapper[4778]: I0318 10:20:35.024650 4778 scope.go:117] "RemoveContainer" containerID="75cf773e9ab812c93a6cb361a559a8ad1ac8bf63ad2f27eb51c3d0a96daa619d"
Mar 18 10:20:43 crc kubenswrapper[4778]: I0318 10:20:43.187161 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:20:43 crc kubenswrapper[4778]: E0318 10:20:43.187945 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:20:54 crc kubenswrapper[4778]: I0318 10:20:54.194752 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:20:54 crc kubenswrapper[4778]: E0318 10:20:54.196153 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:21:07 crc kubenswrapper[4778]: I0318 10:21:07.187752 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:21:07 crc kubenswrapper[4778]: I0318 10:21:07.785548 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"4b8f66334209f29f6c78b08a10e261ea310c6e7d034012d49ee4adbb016d4215"}
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.076937 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-98lf9"]
Mar 18 10:21:51 crc kubenswrapper[4778]: E0318 10:21:51.078008 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b4b190c-f80e-4256-9025-f04279c3b3db" containerName="oc"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.078027 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b4b190c-f80e-4256-9025-f04279c3b3db" containerName="oc"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.078308 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b4b190c-f80e-4256-9025-f04279c3b3db" containerName="oc"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.079986 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.110458 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-98lf9"]
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.148448 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-catalog-content\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.148509 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-utilities\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.148532 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxdj\" (UniqueName: \"kubernetes.io/projected/259d9d60-84b8-48a1-844f-734126616467-kube-api-access-zrxdj\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.250624 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-catalog-content\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.250664 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-utilities\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.250683 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxdj\" (UniqueName: \"kubernetes.io/projected/259d9d60-84b8-48a1-844f-734126616467-kube-api-access-zrxdj\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.251267 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-catalog-content\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.251417 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-utilities\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.275107 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxdj\" (UniqueName: \"kubernetes.io/projected/259d9d60-84b8-48a1-844f-734126616467-kube-api-access-zrxdj\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.422941 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.939875 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-98lf9"] Mar 18 10:21:52 crc kubenswrapper[4778]: I0318 10:21:52.172729 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerStarted","Data":"8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87"} Mar 18 10:21:52 crc kubenswrapper[4778]: I0318 10:21:52.172793 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerStarted","Data":"0f2d7eade11a0d10668484020d0ce0c1eaf35772fb28ce8aca1004ed1ac02bb6"} Mar 18 10:21:53 crc kubenswrapper[4778]: I0318 10:21:53.181723 4778 generic.go:334] "Generic (PLEG): container finished" podID="259d9d60-84b8-48a1-844f-734126616467" containerID="8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87" exitCode=0 Mar 18 10:21:53 crc kubenswrapper[4778]: I0318 10:21:53.181817 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerDied","Data":"8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87"} Mar 18 10:21:53 crc kubenswrapper[4778]: I0318 10:21:53.184954 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:21:55 crc kubenswrapper[4778]: I0318 10:21:55.203223 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerStarted","Data":"d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436"} Mar 18 10:21:56 crc 
kubenswrapper[4778]: E0318 10:21:56.656594 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod259d9d60_84b8_48a1_844f_734126616467.slice/crio-d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod259d9d60_84b8_48a1_844f_734126616467.slice/crio-conmon-d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436.scope\": RecentStats: unable to find data in memory cache]" Mar 18 10:21:57 crc kubenswrapper[4778]: I0318 10:21:57.221795 4778 generic.go:334] "Generic (PLEG): container finished" podID="259d9d60-84b8-48a1-844f-734126616467" containerID="d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436" exitCode=0 Mar 18 10:21:57 crc kubenswrapper[4778]: I0318 10:21:57.222065 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerDied","Data":"d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436"} Mar 18 10:21:58 crc kubenswrapper[4778]: I0318 10:21:58.232592 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerStarted","Data":"0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b"} Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.149289 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-98lf9" podStartSLOduration=4.723572277 podStartE2EDuration="9.149271344s" podCreationTimestamp="2026-03-18 10:21:51 +0000 UTC" firstStartedPulling="2026-03-18 10:21:53.184741534 +0000 UTC m=+4779.759486374" lastFinishedPulling="2026-03-18 10:21:57.610440591 +0000 UTC 
m=+4784.185185441" observedRunningTime="2026-03-18 10:21:58.24943365 +0000 UTC m=+4784.824178510" watchObservedRunningTime="2026-03-18 10:22:00.149271344 +0000 UTC m=+4786.724016184" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.156226 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563822-hg65s"] Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.157832 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.161692 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.162096 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.162276 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.165149 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563822-hg65s"] Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.234742 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfspn\" (UniqueName: \"kubernetes.io/projected/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8-kube-api-access-wfspn\") pod \"auto-csr-approver-29563822-hg65s\" (UID: \"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8\") " pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.336014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfspn\" (UniqueName: \"kubernetes.io/projected/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8-kube-api-access-wfspn\") pod \"auto-csr-approver-29563822-hg65s\" 
(UID: \"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8\") " pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.355869 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfspn\" (UniqueName: \"kubernetes.io/projected/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8-kube-api-access-wfspn\") pod \"auto-csr-approver-29563822-hg65s\" (UID: \"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8\") " pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.481831 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.931928 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563822-hg65s"] Mar 18 10:22:01 crc kubenswrapper[4778]: I0318 10:22:01.260020 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563822-hg65s" event={"ID":"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8","Type":"ContainerStarted","Data":"322fe25189f2d4f140f8c79b8a68cba0920aa3555453f9a2af1fd68c796840fa"} Mar 18 10:22:01 crc kubenswrapper[4778]: I0318 10:22:01.424386 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:22:01 crc kubenswrapper[4778]: I0318 10:22:01.424463 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:22:01 crc kubenswrapper[4778]: I0318 10:22:01.474657 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:22:02 crc kubenswrapper[4778]: I0318 10:22:02.323651 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:22:02 crc 
kubenswrapper[4778]: I0318 10:22:02.384309 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-98lf9"] Mar 18 10:22:03 crc kubenswrapper[4778]: I0318 10:22:03.281920 4778 generic.go:334] "Generic (PLEG): container finished" podID="b71bd8a1-ed53-4e72-8316-7bf3774ee1d8" containerID="23fd2e1aeae3db5eb1358b08c80f32e6a7b3195affe83ad0ba4169976ea65d12" exitCode=0 Mar 18 10:22:03 crc kubenswrapper[4778]: I0318 10:22:03.282000 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563822-hg65s" event={"ID":"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8","Type":"ContainerDied","Data":"23fd2e1aeae3db5eb1358b08c80f32e6a7b3195affe83ad0ba4169976ea65d12"} Mar 18 10:22:04 crc kubenswrapper[4778]: I0318 10:22:04.308330 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-98lf9" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="registry-server" containerID="cri-o://0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b" gracePeriod=2 Mar 18 10:22:04 crc kubenswrapper[4778]: I0318 10:22:04.819715 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:04 crc kubenswrapper[4778]: I0318 10:22:04.934342 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfspn\" (UniqueName: \"kubernetes.io/projected/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8-kube-api-access-wfspn\") pod \"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8\" (UID: \"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8\") " Mar 18 10:22:04 crc kubenswrapper[4778]: I0318 10:22:04.988937 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8-kube-api-access-wfspn" (OuterVolumeSpecName: "kube-api-access-wfspn") pod "b71bd8a1-ed53-4e72-8316-7bf3774ee1d8" (UID: "b71bd8a1-ed53-4e72-8316-7bf3774ee1d8"). InnerVolumeSpecName "kube-api-access-wfspn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.036636 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfspn\" (UniqueName: \"kubernetes.io/projected/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8-kube-api-access-wfspn\") on node \"crc\" DevicePath \"\"" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.104013 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.241046 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrxdj\" (UniqueName: \"kubernetes.io/projected/259d9d60-84b8-48a1-844f-734126616467-kube-api-access-zrxdj\") pod \"259d9d60-84b8-48a1-844f-734126616467\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.241409 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-utilities\") pod \"259d9d60-84b8-48a1-844f-734126616467\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.241531 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-catalog-content\") pod \"259d9d60-84b8-48a1-844f-734126616467\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.242163 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-utilities" (OuterVolumeSpecName: "utilities") pod "259d9d60-84b8-48a1-844f-734126616467" (UID: "259d9d60-84b8-48a1-844f-734126616467"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.245649 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/259d9d60-84b8-48a1-844f-734126616467-kube-api-access-zrxdj" (OuterVolumeSpecName: "kube-api-access-zrxdj") pod "259d9d60-84b8-48a1-844f-734126616467" (UID: "259d9d60-84b8-48a1-844f-734126616467"). InnerVolumeSpecName "kube-api-access-zrxdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.287281 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "259d9d60-84b8-48a1-844f-734126616467" (UID: "259d9d60-84b8-48a1-844f-734126616467"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.318902 4778 generic.go:334] "Generic (PLEG): container finished" podID="259d9d60-84b8-48a1-844f-734126616467" containerID="0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b" exitCode=0 Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.318972 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerDied","Data":"0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b"} Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.319007 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerDied","Data":"0f2d7eade11a0d10668484020d0ce0c1eaf35772fb28ce8aca1004ed1ac02bb6"} Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.319032 4778 scope.go:117] "RemoveContainer" containerID="0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.319179 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.327730 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563822-hg65s" event={"ID":"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8","Type":"ContainerDied","Data":"322fe25189f2d4f140f8c79b8a68cba0920aa3555453f9a2af1fd68c796840fa"} Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.327864 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="322fe25189f2d4f140f8c79b8a68cba0920aa3555453f9a2af1fd68c796840fa" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.327932 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.344746 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrxdj\" (UniqueName: \"kubernetes.io/projected/259d9d60-84b8-48a1-844f-734126616467-kube-api-access-zrxdj\") on node \"crc\" DevicePath \"\"" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.344776 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.344785 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.356224 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-98lf9"] Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.357968 4778 scope.go:117] "RemoveContainer" containerID="d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436" Mar 18 10:22:05 
crc kubenswrapper[4778]: I0318 10:22:05.368107 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-98lf9"] Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.381753 4778 scope.go:117] "RemoveContainer" containerID="8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.399908 4778 scope.go:117] "RemoveContainer" containerID="0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b" Mar 18 10:22:05 crc kubenswrapper[4778]: E0318 10:22:05.400358 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b\": container with ID starting with 0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b not found: ID does not exist" containerID="0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.400406 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b"} err="failed to get container status \"0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b\": rpc error: code = NotFound desc = could not find container \"0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b\": container with ID starting with 0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b not found: ID does not exist" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.400428 4778 scope.go:117] "RemoveContainer" containerID="d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436" Mar 18 10:22:05 crc kubenswrapper[4778]: E0318 10:22:05.400790 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436\": container with ID starting with d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436 not found: ID does not exist" containerID="d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.400845 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436"} err="failed to get container status \"d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436\": rpc error: code = NotFound desc = could not find container \"d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436\": container with ID starting with d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436 not found: ID does not exist" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.400860 4778 scope.go:117] "RemoveContainer" containerID="8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87" Mar 18 10:22:05 crc kubenswrapper[4778]: E0318 10:22:05.401098 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87\": container with ID starting with 8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87 not found: ID does not exist" containerID="8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.401145 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87"} err="failed to get container status \"8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87\": rpc error: code = NotFound desc = could not find container \"8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87\": container with ID 
starting with 8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87 not found: ID does not exist" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.897738 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563816-ljgbz"] Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.911775 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563816-ljgbz"] Mar 18 10:22:06 crc kubenswrapper[4778]: I0318 10:22:06.198957 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="259d9d60-84b8-48a1-844f-734126616467" path="/var/lib/kubelet/pods/259d9d60-84b8-48a1-844f-734126616467/volumes" Mar 18 10:22:06 crc kubenswrapper[4778]: I0318 10:22:06.199937 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62302aba-bf34-4318-9599-2752789a925f" path="/var/lib/kubelet/pods/62302aba-bf34-4318-9599-2752789a925f/volumes" Mar 18 10:22:35 crc kubenswrapper[4778]: I0318 10:22:35.115758 4778 scope.go:117] "RemoveContainer" containerID="86539332f3b2bee69c9852d9c08bf1b20f84cb5d7d5b3975360dc3cdaf5134cb" Mar 18 10:23:30 crc kubenswrapper[4778]: I0318 10:23:30.147340 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:23:30 crc kubenswrapper[4778]: I0318 10:23:30.147953 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.142756 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563824-cgc65"] Mar 18 10:24:00 crc kubenswrapper[4778]: E0318 10:24:00.143963 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="extract-content" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.143986 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="extract-content" Mar 18 10:24:00 crc kubenswrapper[4778]: E0318 10:24:00.143999 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="registry-server" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.144007 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="registry-server" Mar 18 10:24:00 crc kubenswrapper[4778]: E0318 10:24:00.144035 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b71bd8a1-ed53-4e72-8316-7bf3774ee1d8" containerName="oc" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.144044 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71bd8a1-ed53-4e72-8316-7bf3774ee1d8" containerName="oc" Mar 18 10:24:00 crc kubenswrapper[4778]: E0318 10:24:00.144075 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="extract-utilities" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.144084 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="extract-utilities" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.144381 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="registry-server" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.144411 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b71bd8a1-ed53-4e72-8316-7bf3774ee1d8" 
containerName="oc" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.145529 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.147575 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.147635 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.148005 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.148821 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.149155 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.156044 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563824-cgc65"] Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.296835 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74rgk\" (UniqueName: \"kubernetes.io/projected/a10aa4ea-573d-4956-953f-4bdef827448d-kube-api-access-74rgk\") pod \"auto-csr-approver-29563824-cgc65\" 
(UID: \"a10aa4ea-573d-4956-953f-4bdef827448d\") " pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.399137 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74rgk\" (UniqueName: \"kubernetes.io/projected/a10aa4ea-573d-4956-953f-4bdef827448d-kube-api-access-74rgk\") pod \"auto-csr-approver-29563824-cgc65\" (UID: \"a10aa4ea-573d-4956-953f-4bdef827448d\") " pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.432673 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74rgk\" (UniqueName: \"kubernetes.io/projected/a10aa4ea-573d-4956-953f-4bdef827448d-kube-api-access-74rgk\") pod \"auto-csr-approver-29563824-cgc65\" (UID: \"a10aa4ea-573d-4956-953f-4bdef827448d\") " pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.465463 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.976365 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563824-cgc65"] Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.036183 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5h69g"] Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.038665 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.048481 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5h69g"] Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.214319 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-catalog-content\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.214698 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-utilities\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.214755 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74tfk\" (UniqueName: \"kubernetes.io/projected/2ac74550-1228-4f02-a1fc-9816bc63eb22-kube-api-access-74tfk\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.316337 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-utilities\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.316381 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-74tfk\" (UniqueName: \"kubernetes.io/projected/2ac74550-1228-4f02-a1fc-9816bc63eb22-kube-api-access-74tfk\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.316497 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-catalog-content\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.317322 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-utilities\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.317370 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-catalog-content\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.337676 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74tfk\" (UniqueName: \"kubernetes.io/projected/2ac74550-1228-4f02-a1fc-9816bc63eb22-kube-api-access-74tfk\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.375907 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.496620 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563824-cgc65" event={"ID":"a10aa4ea-573d-4956-953f-4bdef827448d","Type":"ContainerStarted","Data":"9adcc651841abf3955ab90d6ae894f7a37101852ea93099acf016628a334ccef"} Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.836259 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5h69g"] Mar 18 10:24:02 crc kubenswrapper[4778]: I0318 10:24:02.507363 4778 generic.go:334] "Generic (PLEG): container finished" podID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerID="cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304" exitCode=0 Mar 18 10:24:02 crc kubenswrapper[4778]: I0318 10:24:02.507460 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5h69g" event={"ID":"2ac74550-1228-4f02-a1fc-9816bc63eb22","Type":"ContainerDied","Data":"cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304"} Mar 18 10:24:02 crc kubenswrapper[4778]: I0318 10:24:02.507740 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5h69g" event={"ID":"2ac74550-1228-4f02-a1fc-9816bc63eb22","Type":"ContainerStarted","Data":"9ffd33c3c22cf4a977b61b12575074605267705c1e086f0ed6e287067b8b808d"} Mar 18 10:24:03 crc kubenswrapper[4778]: I0318 10:24:03.518587 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563824-cgc65" event={"ID":"a10aa4ea-573d-4956-953f-4bdef827448d","Type":"ContainerStarted","Data":"5830290d694881dc2acd7c8637c4816f24221da7b703db7749adb1ec9ce95a1b"} Mar 18 10:24:03 crc kubenswrapper[4778]: I0318 10:24:03.541044 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563824-cgc65" 
podStartSLOduration=2.366704243 podStartE2EDuration="3.541023841s" podCreationTimestamp="2026-03-18 10:24:00 +0000 UTC" firstStartedPulling="2026-03-18 10:24:01.302359297 +0000 UTC m=+4907.877104147" lastFinishedPulling="2026-03-18 10:24:02.476678895 +0000 UTC m=+4909.051423745" observedRunningTime="2026-03-18 10:24:03.538765539 +0000 UTC m=+4910.113510399" watchObservedRunningTime="2026-03-18 10:24:03.541023841 +0000 UTC m=+4910.115768681" Mar 18 10:24:04 crc kubenswrapper[4778]: I0318 10:24:04.528624 4778 generic.go:334] "Generic (PLEG): container finished" podID="a10aa4ea-573d-4956-953f-4bdef827448d" containerID="5830290d694881dc2acd7c8637c4816f24221da7b703db7749adb1ec9ce95a1b" exitCode=0 Mar 18 10:24:04 crc kubenswrapper[4778]: I0318 10:24:04.528853 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563824-cgc65" event={"ID":"a10aa4ea-573d-4956-953f-4bdef827448d","Type":"ContainerDied","Data":"5830290d694881dc2acd7c8637c4816f24221da7b703db7749adb1ec9ce95a1b"} Mar 18 10:24:04 crc kubenswrapper[4778]: I0318 10:24:04.530763 4778 generic.go:334] "Generic (PLEG): container finished" podID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerID="58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a" exitCode=0 Mar 18 10:24:04 crc kubenswrapper[4778]: I0318 10:24:04.530798 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5h69g" event={"ID":"2ac74550-1228-4f02-a1fc-9816bc63eb22","Type":"ContainerDied","Data":"58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a"} Mar 18 10:24:05 crc kubenswrapper[4778]: I0318 10:24:05.540844 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5h69g" event={"ID":"2ac74550-1228-4f02-a1fc-9816bc63eb22","Type":"ContainerStarted","Data":"98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383"} Mar 18 10:24:05 crc kubenswrapper[4778]: I0318 10:24:05.570804 4778 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5h69g" podStartSLOduration=2.021902121 podStartE2EDuration="4.570783798s" podCreationTimestamp="2026-03-18 10:24:01 +0000 UTC" firstStartedPulling="2026-03-18 10:24:02.509041082 +0000 UTC m=+4909.083785932" lastFinishedPulling="2026-03-18 10:24:05.057922769 +0000 UTC m=+4911.632667609" observedRunningTime="2026-03-18 10:24:05.563019448 +0000 UTC m=+4912.137764288" watchObservedRunningTime="2026-03-18 10:24:05.570783798 +0000 UTC m=+4912.145528638" Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.040236 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.223877 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74rgk\" (UniqueName: \"kubernetes.io/projected/a10aa4ea-573d-4956-953f-4bdef827448d-kube-api-access-74rgk\") pod \"a10aa4ea-573d-4956-953f-4bdef827448d\" (UID: \"a10aa4ea-573d-4956-953f-4bdef827448d\") " Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.228819 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a10aa4ea-573d-4956-953f-4bdef827448d-kube-api-access-74rgk" (OuterVolumeSpecName: "kube-api-access-74rgk") pod "a10aa4ea-573d-4956-953f-4bdef827448d" (UID: "a10aa4ea-573d-4956-953f-4bdef827448d"). InnerVolumeSpecName "kube-api-access-74rgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.326906 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74rgk\" (UniqueName: \"kubernetes.io/projected/a10aa4ea-573d-4956-953f-4bdef827448d-kube-api-access-74rgk\") on node \"crc\" DevicePath \"\"" Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.550537 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563824-cgc65" event={"ID":"a10aa4ea-573d-4956-953f-4bdef827448d","Type":"ContainerDied","Data":"9adcc651841abf3955ab90d6ae894f7a37101852ea93099acf016628a334ccef"} Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.550599 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9adcc651841abf3955ab90d6ae894f7a37101852ea93099acf016628a334ccef" Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.550556 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.609059 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563818-c776t"] Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.616989 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563818-c776t"] Mar 18 10:24:08 crc kubenswrapper[4778]: I0318 10:24:08.201814 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b16b969-ac86-4725-910d-797cd1faedc9" path="/var/lib/kubelet/pods/9b16b969-ac86-4725-910d-797cd1faedc9/volumes" Mar 18 10:24:11 crc kubenswrapper[4778]: I0318 10:24:11.376816 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:11 crc kubenswrapper[4778]: I0318 10:24:11.377070 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:11 crc kubenswrapper[4778]: I0318 10:24:11.425679 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:11 crc kubenswrapper[4778]: I0318 10:24:11.663553 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:12 crc kubenswrapper[4778]: I0318 10:24:12.808015 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5h69g"] Mar 18 10:24:13 crc kubenswrapper[4778]: I0318 10:24:13.631285 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5h69g" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="registry-server" containerID="cri-o://98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383" gracePeriod=2 Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.605926 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.643524 4778 generic.go:334] "Generic (PLEG): container finished" podID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerID="98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383" exitCode=0 Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.643562 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5h69g" event={"ID":"2ac74550-1228-4f02-a1fc-9816bc63eb22","Type":"ContainerDied","Data":"98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383"} Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.643589 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5h69g" event={"ID":"2ac74550-1228-4f02-a1fc-9816bc63eb22","Type":"ContainerDied","Data":"9ffd33c3c22cf4a977b61b12575074605267705c1e086f0ed6e287067b8b808d"} Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.643605 4778 scope.go:117] "RemoveContainer" containerID="98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.643718 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.673983 4778 scope.go:117] "RemoveContainer" containerID="58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.692019 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74tfk\" (UniqueName: \"kubernetes.io/projected/2ac74550-1228-4f02-a1fc-9816bc63eb22-kube-api-access-74tfk\") pod \"2ac74550-1228-4f02-a1fc-9816bc63eb22\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.692317 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-catalog-content\") pod \"2ac74550-1228-4f02-a1fc-9816bc63eb22\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.692359 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-utilities\") pod \"2ac74550-1228-4f02-a1fc-9816bc63eb22\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.693718 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-utilities" (OuterVolumeSpecName: "utilities") pod "2ac74550-1228-4f02-a1fc-9816bc63eb22" (UID: "2ac74550-1228-4f02-a1fc-9816bc63eb22"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.701600 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac74550-1228-4f02-a1fc-9816bc63eb22-kube-api-access-74tfk" (OuterVolumeSpecName: "kube-api-access-74tfk") pod "2ac74550-1228-4f02-a1fc-9816bc63eb22" (UID: "2ac74550-1228-4f02-a1fc-9816bc63eb22"). InnerVolumeSpecName "kube-api-access-74tfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.715234 4778 scope.go:117] "RemoveContainer" containerID="cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.790782 4778 scope.go:117] "RemoveContainer" containerID="98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383" Mar 18 10:24:14 crc kubenswrapper[4778]: E0318 10:24:14.791678 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383\": container with ID starting with 98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383 not found: ID does not exist" containerID="98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.791711 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383"} err="failed to get container status \"98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383\": rpc error: code = NotFound desc = could not find container \"98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383\": container with ID starting with 98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383 not found: ID does not exist" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.791738 
4778 scope.go:117] "RemoveContainer" containerID="58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a" Mar 18 10:24:14 crc kubenswrapper[4778]: E0318 10:24:14.792162 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a\": container with ID starting with 58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a not found: ID does not exist" containerID="58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.792215 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a"} err="failed to get container status \"58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a\": rpc error: code = NotFound desc = could not find container \"58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a\": container with ID starting with 58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a not found: ID does not exist" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.792241 4778 scope.go:117] "RemoveContainer" containerID="cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304" Mar 18 10:24:14 crc kubenswrapper[4778]: E0318 10:24:14.792494 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304\": container with ID starting with cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304 not found: ID does not exist" containerID="cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.792546 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304"} err="failed to get container status \"cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304\": rpc error: code = NotFound desc = could not find container \"cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304\": container with ID starting with cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304 not found: ID does not exist" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.795547 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.795595 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74tfk\" (UniqueName: \"kubernetes.io/projected/2ac74550-1228-4f02-a1fc-9816bc63eb22-kube-api-access-74tfk\") on node \"crc\" DevicePath \"\"" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.971950 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ac74550-1228-4f02-a1fc-9816bc63eb22" (UID: "2ac74550-1228-4f02-a1fc-9816bc63eb22"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.999190 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:24:15 crc kubenswrapper[4778]: I0318 10:24:15.295391 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5h69g"] Mar 18 10:24:15 crc kubenswrapper[4778]: I0318 10:24:15.309243 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5h69g"] Mar 18 10:24:16 crc kubenswrapper[4778]: I0318 10:24:16.219027 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" path="/var/lib/kubelet/pods/2ac74550-1228-4f02-a1fc-9816bc63eb22/volumes" Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.147118 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.147608 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.147649 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.148402 4778 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b8f66334209f29f6c78b08a10e261ea310c6e7d034012d49ee4adbb016d4215"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.148462 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://4b8f66334209f29f6c78b08a10e261ea310c6e7d034012d49ee4adbb016d4215" gracePeriod=600 Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.791386 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="4b8f66334209f29f6c78b08a10e261ea310c6e7d034012d49ee4adbb016d4215" exitCode=0 Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.791459 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"4b8f66334209f29f6c78b08a10e261ea310c6e7d034012d49ee4adbb016d4215"} Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.791851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36"} Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.791877 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:24:35 crc kubenswrapper[4778]: I0318 10:24:35.249186 4778 scope.go:117] "RemoveContainer" containerID="ecf9caf224b383664e625024d70d285619a974017bb319f2a60a8627b5e0d68b" Mar 18 
10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.151185 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563826-w6vsk"] Mar 18 10:26:00 crc kubenswrapper[4778]: E0318 10:26:00.152063 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="extract-utilities" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.152075 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="extract-utilities" Mar 18 10:26:00 crc kubenswrapper[4778]: E0318 10:26:00.152094 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10aa4ea-573d-4956-953f-4bdef827448d" containerName="oc" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.152099 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10aa4ea-573d-4956-953f-4bdef827448d" containerName="oc" Mar 18 10:26:00 crc kubenswrapper[4778]: E0318 10:26:00.152120 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="registry-server" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.152127 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="registry-server" Mar 18 10:26:00 crc kubenswrapper[4778]: E0318 10:26:00.152142 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="extract-content" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.152151 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="extract-content" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.152349 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a10aa4ea-573d-4956-953f-4bdef827448d" containerName="oc" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.152373 4778 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="registry-server" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.153014 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.156608 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.158595 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.159511 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.160668 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563826-w6vsk"] Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.292386 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbk24\" (UniqueName: \"kubernetes.io/projected/16b96d84-1d96-4b9b-b266-522602e5000d-kube-api-access-vbk24\") pod \"auto-csr-approver-29563826-w6vsk\" (UID: \"16b96d84-1d96-4b9b-b266-522602e5000d\") " pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.394675 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbk24\" (UniqueName: \"kubernetes.io/projected/16b96d84-1d96-4b9b-b266-522602e5000d-kube-api-access-vbk24\") pod \"auto-csr-approver-29563826-w6vsk\" (UID: \"16b96d84-1d96-4b9b-b266-522602e5000d\") " pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.415270 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vbk24\" (UniqueName: \"kubernetes.io/projected/16b96d84-1d96-4b9b-b266-522602e5000d-kube-api-access-vbk24\") pod \"auto-csr-approver-29563826-w6vsk\" (UID: \"16b96d84-1d96-4b9b-b266-522602e5000d\") " pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.474017 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.942240 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563826-w6vsk"] Mar 18 10:26:01 crc kubenswrapper[4778]: I0318 10:26:01.668574 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" event={"ID":"16b96d84-1d96-4b9b-b266-522602e5000d","Type":"ContainerStarted","Data":"3f35e7604c6ec1277d4a92b161011db1bc29d7c04c76f7cef434cc263c42ef3c"} Mar 18 10:26:03 crc kubenswrapper[4778]: I0318 10:26:03.689706 4778 generic.go:334] "Generic (PLEG): container finished" podID="16b96d84-1d96-4b9b-b266-522602e5000d" containerID="53ff744bef01ec78e09ff6d04137bcd8e4bacfa9ad131e28bbc23695a24879a8" exitCode=0 Mar 18 10:26:03 crc kubenswrapper[4778]: I0318 10:26:03.689773 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" event={"ID":"16b96d84-1d96-4b9b-b266-522602e5000d","Type":"ContainerDied","Data":"53ff744bef01ec78e09ff6d04137bcd8e4bacfa9ad131e28bbc23695a24879a8"} Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.249793 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.388479 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbk24\" (UniqueName: \"kubernetes.io/projected/16b96d84-1d96-4b9b-b266-522602e5000d-kube-api-access-vbk24\") pod \"16b96d84-1d96-4b9b-b266-522602e5000d\" (UID: \"16b96d84-1d96-4b9b-b266-522602e5000d\") " Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.401251 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b96d84-1d96-4b9b-b266-522602e5000d-kube-api-access-vbk24" (OuterVolumeSpecName: "kube-api-access-vbk24") pod "16b96d84-1d96-4b9b-b266-522602e5000d" (UID: "16b96d84-1d96-4b9b-b266-522602e5000d"). InnerVolumeSpecName "kube-api-access-vbk24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.490640 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbk24\" (UniqueName: \"kubernetes.io/projected/16b96d84-1d96-4b9b-b266-522602e5000d-kube-api-access-vbk24\") on node \"crc\" DevicePath \"\"" Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.713540 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" event={"ID":"16b96d84-1d96-4b9b-b266-522602e5000d","Type":"ContainerDied","Data":"3f35e7604c6ec1277d4a92b161011db1bc29d7c04c76f7cef434cc263c42ef3c"} Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.713827 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f35e7604c6ec1277d4a92b161011db1bc29d7c04c76f7cef434cc263c42ef3c" Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.713593 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:06 crc kubenswrapper[4778]: I0318 10:26:06.340412 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563820-xjqs9"] Mar 18 10:26:06 crc kubenswrapper[4778]: I0318 10:26:06.348298 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563820-xjqs9"] Mar 18 10:26:08 crc kubenswrapper[4778]: I0318 10:26:08.205749 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b4b190c-f80e-4256-9025-f04279c3b3db" path="/var/lib/kubelet/pods/0b4b190c-f80e-4256-9025-f04279c3b3db/volumes" Mar 18 10:26:30 crc kubenswrapper[4778]: I0318 10:26:30.147872 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:26:30 crc kubenswrapper[4778]: I0318 10:26:30.148409 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:26:35 crc kubenswrapper[4778]: I0318 10:26:35.359172 4778 scope.go:117] "RemoveContainer" containerID="f18fd314b43b902914cecf41007b7ceab63380321a7a2e1eeb69b1d4c6a07a98" Mar 18 10:27:00 crc kubenswrapper[4778]: I0318 10:27:00.148113 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:27:00 crc kubenswrapper[4778]: 
I0318 10:27:00.148818 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.147608 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.148308 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.148372 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.149532 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.149646 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" 
containerName="machine-config-daemon" containerID="cri-o://aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" gracePeriod=600 Mar 18 10:27:30 crc kubenswrapper[4778]: E0318 10:27:30.285333 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.481147 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" exitCode=0 Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.481704 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36"} Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.481761 4778 scope.go:117] "RemoveContainer" containerID="4b8f66334209f29f6c78b08a10e261ea310c6e7d034012d49ee4adbb016d4215" Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.482732 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:27:30 crc kubenswrapper[4778]: E0318 10:27:30.483067 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:27:41 crc kubenswrapper[4778]: I0318 10:27:41.188519 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:27:41 crc kubenswrapper[4778]: E0318 10:27:41.189954 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.556641 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w2ksc"] Mar 18 10:27:44 crc kubenswrapper[4778]: E0318 10:27:44.557454 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b96d84-1d96-4b9b-b266-522602e5000d" containerName="oc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.557466 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b96d84-1d96-4b9b-b266-522602e5000d" containerName="oc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.557671 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b96d84-1d96-4b9b-b266-522602e5000d" containerName="oc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.558998 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.565914 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-catalog-content\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.566035 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7s2s\" (UniqueName: \"kubernetes.io/projected/b36149c6-492e-4fdd-8955-9b0ca1ab902c-kube-api-access-d7s2s\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.566141 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-utilities\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.570648 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2ksc"] Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.667235 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-catalog-content\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.667588 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-d7s2s\" (UniqueName: \"kubernetes.io/projected/b36149c6-492e-4fdd-8955-9b0ca1ab902c-kube-api-access-d7s2s\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.667651 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-utilities\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.667937 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-catalog-content\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.667988 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-utilities\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.691186 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7s2s\" (UniqueName: \"kubernetes.io/projected/b36149c6-492e-4fdd-8955-9b0ca1ab902c-kube-api-access-d7s2s\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.908431 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:45 crc kubenswrapper[4778]: I0318 10:27:45.424608 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2ksc"] Mar 18 10:27:45 crc kubenswrapper[4778]: I0318 10:27:45.640048 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerStarted","Data":"28a1912ae4357de244cb7f9f01d8d2ea6f7fb0931ecc2ea21316f9938d55ba8c"} Mar 18 10:27:46 crc kubenswrapper[4778]: I0318 10:27:46.650301 4778 generic.go:334] "Generic (PLEG): container finished" podID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerID="ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97" exitCode=0 Mar 18 10:27:46 crc kubenswrapper[4778]: I0318 10:27:46.650392 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerDied","Data":"ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97"} Mar 18 10:27:46 crc kubenswrapper[4778]: I0318 10:27:46.656273 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:27:48 crc kubenswrapper[4778]: I0318 10:27:48.694640 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerStarted","Data":"295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc"} Mar 18 10:27:52 crc kubenswrapper[4778]: I0318 10:27:52.736049 4778 generic.go:334] "Generic (PLEG): container finished" podID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerID="295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc" exitCode=0 Mar 18 10:27:52 crc kubenswrapper[4778]: I0318 10:27:52.736111 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerDied","Data":"295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc"} Mar 18 10:27:53 crc kubenswrapper[4778]: I0318 10:27:53.188315 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:27:53 crc kubenswrapper[4778]: E0318 10:27:53.189562 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:27:53 crc kubenswrapper[4778]: I0318 10:27:53.746024 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerStarted","Data":"f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24"} Mar 18 10:27:53 crc kubenswrapper[4778]: I0318 10:27:53.778480 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w2ksc" podStartSLOduration=3.312462954 podStartE2EDuration="9.778462744s" podCreationTimestamp="2026-03-18 10:27:44 +0000 UTC" firstStartedPulling="2026-03-18 10:27:46.655659571 +0000 UTC m=+5133.230404451" lastFinishedPulling="2026-03-18 10:27:53.121659401 +0000 UTC m=+5139.696404241" observedRunningTime="2026-03-18 10:27:53.771154475 +0000 UTC m=+5140.345899305" watchObservedRunningTime="2026-03-18 10:27:53.778462744 +0000 UTC m=+5140.353207574" Mar 18 10:27:54 crc kubenswrapper[4778]: I0318 10:27:54.908957 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:54 crc kubenswrapper[4778]: I0318 10:27:54.909314 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:55 crc kubenswrapper[4778]: I0318 10:27:55.963288 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2ksc" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" probeResult="failure" output=< Mar 18 10:27:55 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:27:55 crc kubenswrapper[4778]: > Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.141318 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563828-bg8zw"] Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.143179 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.145153 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.148058 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.148123 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.152859 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563828-bg8zw"] Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.194429 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsmgf\" (UniqueName: 
\"kubernetes.io/projected/6b507e48-f0a9-4938-ad91-298a6f90aad1-kube-api-access-rsmgf\") pod \"auto-csr-approver-29563828-bg8zw\" (UID: \"6b507e48-f0a9-4938-ad91-298a6f90aad1\") " pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.297593 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsmgf\" (UniqueName: \"kubernetes.io/projected/6b507e48-f0a9-4938-ad91-298a6f90aad1-kube-api-access-rsmgf\") pod \"auto-csr-approver-29563828-bg8zw\" (UID: \"6b507e48-f0a9-4938-ad91-298a6f90aad1\") " pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.320620 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsmgf\" (UniqueName: \"kubernetes.io/projected/6b507e48-f0a9-4938-ad91-298a6f90aad1-kube-api-access-rsmgf\") pod \"auto-csr-approver-29563828-bg8zw\" (UID: \"6b507e48-f0a9-4938-ad91-298a6f90aad1\") " pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.493749 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:01 crc kubenswrapper[4778]: I0318 10:28:01.023854 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563828-bg8zw"] Mar 18 10:28:01 crc kubenswrapper[4778]: I0318 10:28:01.808270 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" event={"ID":"6b507e48-f0a9-4938-ad91-298a6f90aad1","Type":"ContainerStarted","Data":"d393a87c854b4e694fede6b64fff4ff35037a5b637ac41f2b2a51acb65651b79"} Mar 18 10:28:02 crc kubenswrapper[4778]: I0318 10:28:02.820969 4778 generic.go:334] "Generic (PLEG): container finished" podID="6b507e48-f0a9-4938-ad91-298a6f90aad1" containerID="37abf2b00f0327218ef392055e69e5b8b65d6c2f27975137e66e86722fc34dea" exitCode=0 Mar 18 10:28:02 crc kubenswrapper[4778]: I0318 10:28:02.821179 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" event={"ID":"6b507e48-f0a9-4938-ad91-298a6f90aad1","Type":"ContainerDied","Data":"37abf2b00f0327218ef392055e69e5b8b65d6c2f27975137e66e86722fc34dea"} Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.360328 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.368100 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsmgf\" (UniqueName: \"kubernetes.io/projected/6b507e48-f0a9-4938-ad91-298a6f90aad1-kube-api-access-rsmgf\") pod \"6b507e48-f0a9-4938-ad91-298a6f90aad1\" (UID: \"6b507e48-f0a9-4938-ad91-298a6f90aad1\") " Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.374364 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b507e48-f0a9-4938-ad91-298a6f90aad1-kube-api-access-rsmgf" (OuterVolumeSpecName: "kube-api-access-rsmgf") pod "6b507e48-f0a9-4938-ad91-298a6f90aad1" (UID: "6b507e48-f0a9-4938-ad91-298a6f90aad1"). InnerVolumeSpecName "kube-api-access-rsmgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.470064 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsmgf\" (UniqueName: \"kubernetes.io/projected/6b507e48-f0a9-4938-ad91-298a6f90aad1-kube-api-access-rsmgf\") on node \"crc\" DevicePath \"\"" Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.842510 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" event={"ID":"6b507e48-f0a9-4938-ad91-298a6f90aad1","Type":"ContainerDied","Data":"d393a87c854b4e694fede6b64fff4ff35037a5b637ac41f2b2a51acb65651b79"} Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.843666 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d393a87c854b4e694fede6b64fff4ff35037a5b637ac41f2b2a51acb65651b79" Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.842579 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:05 crc kubenswrapper[4778]: I0318 10:28:05.441371 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563822-hg65s"] Mar 18 10:28:05 crc kubenswrapper[4778]: I0318 10:28:05.456160 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563822-hg65s"] Mar 18 10:28:05 crc kubenswrapper[4778]: I0318 10:28:05.970921 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2ksc" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" probeResult="failure" output=< Mar 18 10:28:05 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:28:05 crc kubenswrapper[4778]: > Mar 18 10:28:06 crc kubenswrapper[4778]: I0318 10:28:06.187008 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:28:06 crc kubenswrapper[4778]: E0318 10:28:06.187348 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:28:06 crc kubenswrapper[4778]: I0318 10:28:06.214359 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b71bd8a1-ed53-4e72-8316-7bf3774ee1d8" path="/var/lib/kubelet/pods/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8/volumes" Mar 18 10:28:15 crc kubenswrapper[4778]: I0318 10:28:15.978968 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2ksc" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" 
containerName="registry-server" probeResult="failure" output=< Mar 18 10:28:15 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:28:15 crc kubenswrapper[4778]: > Mar 18 10:28:18 crc kubenswrapper[4778]: I0318 10:28:18.187874 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:28:18 crc kubenswrapper[4778]: E0318 10:28:18.188377 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:28:25 crc kubenswrapper[4778]: I0318 10:28:25.966071 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2ksc" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" probeResult="failure" output=< Mar 18 10:28:25 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:28:25 crc kubenswrapper[4778]: > Mar 18 10:28:30 crc kubenswrapper[4778]: I0318 10:28:30.186911 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:28:30 crc kubenswrapper[4778]: E0318 10:28:30.187626 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:28:34 crc 
kubenswrapper[4778]: I0318 10:28:34.963378 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:28:35 crc kubenswrapper[4778]: I0318 10:28:35.025614 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:28:35 crc kubenswrapper[4778]: I0318 10:28:35.199593 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w2ksc"] Mar 18 10:28:35 crc kubenswrapper[4778]: I0318 10:28:35.453512 4778 scope.go:117] "RemoveContainer" containerID="23fd2e1aeae3db5eb1358b08c80f32e6a7b3195affe83ad0ba4169976ea65d12" Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.094562 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w2ksc" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" containerID="cri-o://f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24" gracePeriod=2 Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.761267 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.922667 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-utilities\") pod \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.922722 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-catalog-content\") pod \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.922755 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7s2s\" (UniqueName: \"kubernetes.io/projected/b36149c6-492e-4fdd-8955-9b0ca1ab902c-kube-api-access-d7s2s\") pod \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.924963 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-utilities" (OuterVolumeSpecName: "utilities") pod "b36149c6-492e-4fdd-8955-9b0ca1ab902c" (UID: "b36149c6-492e-4fdd-8955-9b0ca1ab902c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.936752 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b36149c6-492e-4fdd-8955-9b0ca1ab902c-kube-api-access-d7s2s" (OuterVolumeSpecName: "kube-api-access-d7s2s") pod "b36149c6-492e-4fdd-8955-9b0ca1ab902c" (UID: "b36149c6-492e-4fdd-8955-9b0ca1ab902c"). InnerVolumeSpecName "kube-api-access-d7s2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.025118 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.025159 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7s2s\" (UniqueName: \"kubernetes.io/projected/b36149c6-492e-4fdd-8955-9b0ca1ab902c-kube-api-access-d7s2s\") on node \"crc\" DevicePath \"\"" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.058651 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b36149c6-492e-4fdd-8955-9b0ca1ab902c" (UID: "b36149c6-492e-4fdd-8955-9b0ca1ab902c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.105099 4778 generic.go:334] "Generic (PLEG): container finished" podID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerID="f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24" exitCode=0 Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.105160 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.105175 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerDied","Data":"f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24"} Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.106374 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerDied","Data":"28a1912ae4357de244cb7f9f01d8d2ea6f7fb0931ecc2ea21316f9938d55ba8c"} Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.106409 4778 scope.go:117] "RemoveContainer" containerID="f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.127149 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.136855 4778 scope.go:117] "RemoveContainer" containerID="295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.140912 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w2ksc"] Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.151268 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w2ksc"] Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.164618 4778 scope.go:117] "RemoveContainer" containerID="ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.212261 4778 scope.go:117] "RemoveContainer" 
containerID="f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24" Mar 18 10:28:37 crc kubenswrapper[4778]: E0318 10:28:37.212761 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24\": container with ID starting with f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24 not found: ID does not exist" containerID="f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.212809 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24"} err="failed to get container status \"f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24\": rpc error: code = NotFound desc = could not find container \"f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24\": container with ID starting with f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24 not found: ID does not exist" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.212832 4778 scope.go:117] "RemoveContainer" containerID="295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc" Mar 18 10:28:37 crc kubenswrapper[4778]: E0318 10:28:37.213255 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc\": container with ID starting with 295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc not found: ID does not exist" containerID="295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.213385 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc"} err="failed to get container status \"295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc\": rpc error: code = NotFound desc = could not find container \"295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc\": container with ID starting with 295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc not found: ID does not exist" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.213473 4778 scope.go:117] "RemoveContainer" containerID="ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97" Mar 18 10:28:37 crc kubenswrapper[4778]: E0318 10:28:37.213820 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97\": container with ID starting with ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97 not found: ID does not exist" containerID="ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.213846 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97"} err="failed to get container status \"ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97\": rpc error: code = NotFound desc = could not find container \"ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97\": container with ID starting with ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97 not found: ID does not exist" Mar 18 10:28:38 crc kubenswrapper[4778]: I0318 10:28:38.195722 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" path="/var/lib/kubelet/pods/b36149c6-492e-4fdd-8955-9b0ca1ab902c/volumes" Mar 18 10:28:41 crc kubenswrapper[4778]: I0318 
10:28:41.188780 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:28:41 crc kubenswrapper[4778]: E0318 10:28:41.190935 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:28:55 crc kubenswrapper[4778]: I0318 10:28:55.187620 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:28:55 crc kubenswrapper[4778]: E0318 10:28:55.188929 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:29:11 crc kubenswrapper[4778]: I0318 10:29:10.187233 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:29:11 crc kubenswrapper[4778]: E0318 10:29:10.188275 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:29:25 crc 
kubenswrapper[4778]: I0318 10:29:25.186951 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:29:25 crc kubenswrapper[4778]: E0318 10:29:25.188997 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:29:36 crc kubenswrapper[4778]: I0318 10:29:36.190992 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:29:36 crc kubenswrapper[4778]: E0318 10:29:36.191664 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:29:47 crc kubenswrapper[4778]: I0318 10:29:47.188133 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:29:47 crc kubenswrapper[4778]: E0318 10:29:47.189071 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 
18 10:29:58 crc kubenswrapper[4778]: I0318 10:29:58.187806 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:29:58 crc kubenswrapper[4778]: E0318 10:29:58.188707 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.153831 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563830-cmbbv"] Mar 18 10:30:00 crc kubenswrapper[4778]: E0318 10:30:00.154897 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="extract-content" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.154914 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="extract-content" Mar 18 10:30:00 crc kubenswrapper[4778]: E0318 10:30:00.154925 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="extract-utilities" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.154931 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="extract-utilities" Mar 18 10:30:00 crc kubenswrapper[4778]: E0318 10:30:00.154946 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.154951 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" Mar 18 10:30:00 crc kubenswrapper[4778]: E0318 10:30:00.154975 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b507e48-f0a9-4938-ad91-298a6f90aad1" containerName="oc" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.154981 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b507e48-f0a9-4938-ad91-298a6f90aad1" containerName="oc" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.155162 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.155185 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b507e48-f0a9-4938-ad91-298a6f90aad1" containerName="oc" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.155861 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.160838 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.161088 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.161235 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.166435 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m"] Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.167883 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.170391 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.171720 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.178056 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563830-cmbbv"] Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.211526 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m"] Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.234898 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/688101ed-133b-42c6-87f0-fb2ce2afa33f-secret-volume\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.235069 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdww\" (UniqueName: \"kubernetes.io/projected/688101ed-133b-42c6-87f0-fb2ce2afa33f-kube-api-access-9zdww\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.235238 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9j7b\" (UniqueName: 
\"kubernetes.io/projected/020a579d-1395-4039-8c3a-7454709e9af6-kube-api-access-g9j7b\") pod \"auto-csr-approver-29563830-cmbbv\" (UID: \"020a579d-1395-4039-8c3a-7454709e9af6\") " pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.235463 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/688101ed-133b-42c6-87f0-fb2ce2afa33f-config-volume\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.337619 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/688101ed-133b-42c6-87f0-fb2ce2afa33f-secret-volume\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.337695 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdww\" (UniqueName: \"kubernetes.io/projected/688101ed-133b-42c6-87f0-fb2ce2afa33f-kube-api-access-9zdww\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.337762 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9j7b\" (UniqueName: \"kubernetes.io/projected/020a579d-1395-4039-8c3a-7454709e9af6-kube-api-access-g9j7b\") pod \"auto-csr-approver-29563830-cmbbv\" (UID: \"020a579d-1395-4039-8c3a-7454709e9af6\") " pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 
10:30:00.337835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/688101ed-133b-42c6-87f0-fb2ce2afa33f-config-volume\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.338706 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/688101ed-133b-42c6-87f0-fb2ce2afa33f-config-volume\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.351372 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/688101ed-133b-42c6-87f0-fb2ce2afa33f-secret-volume\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.353382 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdww\" (UniqueName: \"kubernetes.io/projected/688101ed-133b-42c6-87f0-fb2ce2afa33f-kube-api-access-9zdww\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.354033 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9j7b\" (UniqueName: \"kubernetes.io/projected/020a579d-1395-4039-8c3a-7454709e9af6-kube-api-access-g9j7b\") pod \"auto-csr-approver-29563830-cmbbv\" (UID: \"020a579d-1395-4039-8c3a-7454709e9af6\") " 
pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.481333 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.490869 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:01 crc kubenswrapper[4778]: I0318 10:30:01.077869 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m"] Mar 18 10:30:01 crc kubenswrapper[4778]: I0318 10:30:01.243402 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563830-cmbbv"] Mar 18 10:30:01 crc kubenswrapper[4778]: W0318 10:30:01.245844 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod020a579d_1395_4039_8c3a_7454709e9af6.slice/crio-4a7633c8556695ef997523713bddc0f5f3c863a3f390cbe59df35a4295532c74 WatchSource:0}: Error finding container 4a7633c8556695ef997523713bddc0f5f3c863a3f390cbe59df35a4295532c74: Status 404 returned error can't find the container with id 4a7633c8556695ef997523713bddc0f5f3c863a3f390cbe59df35a4295532c74 Mar 18 10:30:01 crc kubenswrapper[4778]: I0318 10:30:01.843182 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" event={"ID":"020a579d-1395-4039-8c3a-7454709e9af6","Type":"ContainerStarted","Data":"4a7633c8556695ef997523713bddc0f5f3c863a3f390cbe59df35a4295532c74"} Mar 18 10:30:01 crc kubenswrapper[4778]: I0318 10:30:01.845008 4778 generic.go:334] "Generic (PLEG): container finished" podID="688101ed-133b-42c6-87f0-fb2ce2afa33f" containerID="91439ddaf1c7b64a7912887de697803bd3f4ff4a97a1ee187c7b7ad2914b7556" exitCode=0 Mar 18 10:30:01 crc kubenswrapper[4778]: I0318 
10:30:01.845058 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" event={"ID":"688101ed-133b-42c6-87f0-fb2ce2afa33f","Type":"ContainerDied","Data":"91439ddaf1c7b64a7912887de697803bd3f4ff4a97a1ee187c7b7ad2914b7556"} Mar 18 10:30:01 crc kubenswrapper[4778]: I0318 10:30:01.845101 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" event={"ID":"688101ed-133b-42c6-87f0-fb2ce2afa33f","Type":"ContainerStarted","Data":"4c26e59735d7c3f2f271c0eb690c58016a6af6de5508e76c1d4af772c4103d48"} Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.416909 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.516420 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/688101ed-133b-42c6-87f0-fb2ce2afa33f-config-volume\") pod \"688101ed-133b-42c6-87f0-fb2ce2afa33f\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.516801 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/688101ed-133b-42c6-87f0-fb2ce2afa33f-secret-volume\") pod \"688101ed-133b-42c6-87f0-fb2ce2afa33f\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.516988 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zdww\" (UniqueName: \"kubernetes.io/projected/688101ed-133b-42c6-87f0-fb2ce2afa33f-kube-api-access-9zdww\") pod \"688101ed-133b-42c6-87f0-fb2ce2afa33f\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.519599 4778 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/688101ed-133b-42c6-87f0-fb2ce2afa33f-config-volume" (OuterVolumeSpecName: "config-volume") pod "688101ed-133b-42c6-87f0-fb2ce2afa33f" (UID: "688101ed-133b-42c6-87f0-fb2ce2afa33f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.523311 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688101ed-133b-42c6-87f0-fb2ce2afa33f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "688101ed-133b-42c6-87f0-fb2ce2afa33f" (UID: "688101ed-133b-42c6-87f0-fb2ce2afa33f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.523600 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688101ed-133b-42c6-87f0-fb2ce2afa33f-kube-api-access-9zdww" (OuterVolumeSpecName: "kube-api-access-9zdww") pod "688101ed-133b-42c6-87f0-fb2ce2afa33f" (UID: "688101ed-133b-42c6-87f0-fb2ce2afa33f"). InnerVolumeSpecName "kube-api-access-9zdww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.619530 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zdww\" (UniqueName: \"kubernetes.io/projected/688101ed-133b-42c6-87f0-fb2ce2afa33f-kube-api-access-9zdww\") on node \"crc\" DevicePath \"\"" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.619568 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/688101ed-133b-42c6-87f0-fb2ce2afa33f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.619576 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/688101ed-133b-42c6-87f0-fb2ce2afa33f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.863006 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.862993 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" event={"ID":"688101ed-133b-42c6-87f0-fb2ce2afa33f","Type":"ContainerDied","Data":"4c26e59735d7c3f2f271c0eb690c58016a6af6de5508e76c1d4af772c4103d48"} Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.863138 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c26e59735d7c3f2f271c0eb690c58016a6af6de5508e76c1d4af772c4103d48" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.864546 4778 generic.go:334] "Generic (PLEG): container finished" podID="020a579d-1395-4039-8c3a-7454709e9af6" containerID="e298907ce2b631ab1e3060efd7429ad70a3f2d93551c33b2a088ad16a12f01ae" exitCode=0 Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.864670 4778 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" event={"ID":"020a579d-1395-4039-8c3a-7454709e9af6","Type":"ContainerDied","Data":"e298907ce2b631ab1e3060efd7429ad70a3f2d93551c33b2a088ad16a12f01ae"} Mar 18 10:30:04 crc kubenswrapper[4778]: I0318 10:30:04.500508 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq"] Mar 18 10:30:04 crc kubenswrapper[4778]: I0318 10:30:04.508100 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq"] Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.432363 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.557082 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9j7b\" (UniqueName: \"kubernetes.io/projected/020a579d-1395-4039-8c3a-7454709e9af6-kube-api-access-g9j7b\") pod \"020a579d-1395-4039-8c3a-7454709e9af6\" (UID: \"020a579d-1395-4039-8c3a-7454709e9af6\") " Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.567983 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020a579d-1395-4039-8c3a-7454709e9af6-kube-api-access-g9j7b" (OuterVolumeSpecName: "kube-api-access-g9j7b") pod "020a579d-1395-4039-8c3a-7454709e9af6" (UID: "020a579d-1395-4039-8c3a-7454709e9af6"). InnerVolumeSpecName "kube-api-access-g9j7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.660455 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9j7b\" (UniqueName: \"kubernetes.io/projected/020a579d-1395-4039-8c3a-7454709e9af6-kube-api-access-g9j7b\") on node \"crc\" DevicePath \"\"" Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.882891 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" event={"ID":"020a579d-1395-4039-8c3a-7454709e9af6","Type":"ContainerDied","Data":"4a7633c8556695ef997523713bddc0f5f3c863a3f390cbe59df35a4295532c74"} Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.882931 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a7633c8556695ef997523713bddc0f5f3c863a3f390cbe59df35a4295532c74" Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.882964 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:06 crc kubenswrapper[4778]: I0318 10:30:06.198554 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956ed194-df94-4b74-919f-9cdcfbdcf5a7" path="/var/lib/kubelet/pods/956ed194-df94-4b74-919f-9cdcfbdcf5a7/volumes" Mar 18 10:30:06 crc kubenswrapper[4778]: I0318 10:30:06.491150 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563824-cgc65"] Mar 18 10:30:06 crc kubenswrapper[4778]: I0318 10:30:06.498668 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563824-cgc65"] Mar 18 10:30:08 crc kubenswrapper[4778]: I0318 10:30:08.196685 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a10aa4ea-573d-4956-953f-4bdef827448d" path="/var/lib/kubelet/pods/a10aa4ea-573d-4956-953f-4bdef827448d/volumes" Mar 18 10:30:11 crc kubenswrapper[4778]: I0318 10:30:11.187018 4778 
scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:30:11 crc kubenswrapper[4778]: E0318 10:30:11.187861 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:30:26 crc kubenswrapper[4778]: I0318 10:30:26.187302 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:30:26 crc kubenswrapper[4778]: E0318 10:30:26.188186 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:30:35 crc kubenswrapper[4778]: I0318 10:30:35.568863 4778 scope.go:117] "RemoveContainer" containerID="5830290d694881dc2acd7c8637c4816f24221da7b703db7749adb1ec9ce95a1b" Mar 18 10:30:35 crc kubenswrapper[4778]: I0318 10:30:35.631863 4778 scope.go:117] "RemoveContainer" containerID="b6a6fd51a98d9937da03ae4682cc5b4ae715e8495f9ae8fc3459feb811d9d2fc" Mar 18 10:30:37 crc kubenswrapper[4778]: I0318 10:30:37.189123 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:30:37 crc kubenswrapper[4778]: E0318 10:30:37.189943 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:30:49 crc kubenswrapper[4778]: I0318 10:30:49.187248 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:30:49 crc kubenswrapper[4778]: E0318 10:30:49.187899 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:31:03 crc kubenswrapper[4778]: I0318 10:31:03.188559 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:31:03 crc kubenswrapper[4778]: E0318 10:31:03.189466 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:31:16 crc kubenswrapper[4778]: I0318 10:31:16.187781 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:31:16 crc kubenswrapper[4778]: E0318 10:31:16.188564 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:31:31 crc kubenswrapper[4778]: I0318 10:31:31.187610 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:31:31 crc kubenswrapper[4778]: E0318 10:31:31.188413 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:31:42 crc kubenswrapper[4778]: I0318 10:31:42.187401 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:31:42 crc kubenswrapper[4778]: E0318 10:31:42.189336 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:31:53 crc kubenswrapper[4778]: I0318 10:31:53.188092 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:31:53 crc kubenswrapper[4778]: E0318 10:31:53.189274 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.155496 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563832-8vv9t"] Mar 18 10:32:00 crc kubenswrapper[4778]: E0318 10:32:00.156759 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020a579d-1395-4039-8c3a-7454709e9af6" containerName="oc" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.156783 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="020a579d-1395-4039-8c3a-7454709e9af6" containerName="oc" Mar 18 10:32:00 crc kubenswrapper[4778]: E0318 10:32:00.156847 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688101ed-133b-42c6-87f0-fb2ce2afa33f" containerName="collect-profiles" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.157052 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="688101ed-133b-42c6-87f0-fb2ce2afa33f" containerName="collect-profiles" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.157376 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="020a579d-1395-4039-8c3a-7454709e9af6" containerName="oc" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.157421 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="688101ed-133b-42c6-87f0-fb2ce2afa33f" containerName="collect-profiles" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.158627 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.163417 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.163980 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.165019 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.168797 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563832-8vv9t"] Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.300533 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q92v9\" (UniqueName: \"kubernetes.io/projected/19f52f16-1c49-4aa8-9e7b-10a9bf55e487-kube-api-access-q92v9\") pod \"auto-csr-approver-29563832-8vv9t\" (UID: \"19f52f16-1c49-4aa8-9e7b-10a9bf55e487\") " pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.403381 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q92v9\" (UniqueName: \"kubernetes.io/projected/19f52f16-1c49-4aa8-9e7b-10a9bf55e487-kube-api-access-q92v9\") pod \"auto-csr-approver-29563832-8vv9t\" (UID: \"19f52f16-1c49-4aa8-9e7b-10a9bf55e487\") " pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.439152 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q92v9\" (UniqueName: \"kubernetes.io/projected/19f52f16-1c49-4aa8-9e7b-10a9bf55e487-kube-api-access-q92v9\") pod \"auto-csr-approver-29563832-8vv9t\" (UID: \"19f52f16-1c49-4aa8-9e7b-10a9bf55e487\") " 
pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.488047 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.998915 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563832-8vv9t"] Mar 18 10:32:02 crc kubenswrapper[4778]: I0318 10:32:02.013776 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" event={"ID":"19f52f16-1c49-4aa8-9e7b-10a9bf55e487","Type":"ContainerStarted","Data":"1b386030482fe0667a16e9ddee22539810fc8e913e15ec7861949d47b76c20f2"} Mar 18 10:32:04 crc kubenswrapper[4778]: I0318 10:32:04.058426 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" event={"ID":"19f52f16-1c49-4aa8-9e7b-10a9bf55e487","Type":"ContainerDied","Data":"b3e45b3c111cb68776e6b0e92c1ba10a6ec66666310504f0fe918ad9a95b9a9e"} Mar 18 10:32:04 crc kubenswrapper[4778]: I0318 10:32:04.058272 4778 generic.go:334] "Generic (PLEG): container finished" podID="19f52f16-1c49-4aa8-9e7b-10a9bf55e487" containerID="b3e45b3c111cb68776e6b0e92c1ba10a6ec66666310504f0fe918ad9a95b9a9e" exitCode=0 Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.073689 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.082936 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" event={"ID":"19f52f16-1c49-4aa8-9e7b-10a9bf55e487","Type":"ContainerDied","Data":"1b386030482fe0667a16e9ddee22539810fc8e913e15ec7861949d47b76c20f2"} Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.082984 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b386030482fe0667a16e9ddee22539810fc8e913e15ec7861949d47b76c20f2" Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.083016 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.187463 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:32:06 crc kubenswrapper[4778]: E0318 10:32:06.187800 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.245933 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q92v9\" (UniqueName: \"kubernetes.io/projected/19f52f16-1c49-4aa8-9e7b-10a9bf55e487-kube-api-access-q92v9\") pod \"19f52f16-1c49-4aa8-9e7b-10a9bf55e487\" (UID: \"19f52f16-1c49-4aa8-9e7b-10a9bf55e487\") " Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.251965 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/19f52f16-1c49-4aa8-9e7b-10a9bf55e487-kube-api-access-q92v9" (OuterVolumeSpecName: "kube-api-access-q92v9") pod "19f52f16-1c49-4aa8-9e7b-10a9bf55e487" (UID: "19f52f16-1c49-4aa8-9e7b-10a9bf55e487"). InnerVolumeSpecName "kube-api-access-q92v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.348192 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q92v9\" (UniqueName: \"kubernetes.io/projected/19f52f16-1c49-4aa8-9e7b-10a9bf55e487-kube-api-access-q92v9\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.098757 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2xqrs"] Mar 18 10:32:07 crc kubenswrapper[4778]: E0318 10:32:07.099277 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f52f16-1c49-4aa8-9e7b-10a9bf55e487" containerName="oc" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.099295 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f52f16-1c49-4aa8-9e7b-10a9bf55e487" containerName="oc" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.099582 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f52f16-1c49-4aa8-9e7b-10a9bf55e487" containerName="oc" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.101333 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.109880 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2xqrs"] Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.167043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-utilities\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.167412 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lshv6\" (UniqueName: \"kubernetes.io/projected/85bd1932-7797-4e49-9b0c-a67b5176e03b-kube-api-access-lshv6\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.167665 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-catalog-content\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.174237 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563826-w6vsk"] Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.182572 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563826-w6vsk"] Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.269296 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-catalog-content\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.269361 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-utilities\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.269382 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lshv6\" (UniqueName: \"kubernetes.io/projected/85bd1932-7797-4e49-9b0c-a67b5176e03b-kube-api-access-lshv6\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.270474 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-utilities\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.270497 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-catalog-content\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.287895 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lshv6\" (UniqueName: 
\"kubernetes.io/projected/85bd1932-7797-4e49-9b0c-a67b5176e03b-kube-api-access-lshv6\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.439875 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.995635 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2xqrs"] Mar 18 10:32:07 crc kubenswrapper[4778]: W0318 10:32:07.998566 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85bd1932_7797_4e49_9b0c_a67b5176e03b.slice/crio-b3b749541cd7e8e5266bf86b325dca76bccc806590445c6a2bf5c909ceffb2a2 WatchSource:0}: Error finding container b3b749541cd7e8e5266bf86b325dca76bccc806590445c6a2bf5c909ceffb2a2: Status 404 returned error can't find the container with id b3b749541cd7e8e5266bf86b325dca76bccc806590445c6a2bf5c909ceffb2a2 Mar 18 10:32:08 crc kubenswrapper[4778]: I0318 10:32:08.110126 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerStarted","Data":"b3b749541cd7e8e5266bf86b325dca76bccc806590445c6a2bf5c909ceffb2a2"} Mar 18 10:32:08 crc kubenswrapper[4778]: I0318 10:32:08.204774 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b96d84-1d96-4b9b-b266-522602e5000d" path="/var/lib/kubelet/pods/16b96d84-1d96-4b9b-b266-522602e5000d/volumes" Mar 18 10:32:09 crc kubenswrapper[4778]: I0318 10:32:09.119788 4778 generic.go:334] "Generic (PLEG): container finished" podID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerID="d4ab5a2a9be3faa54895e3ea4c33073a1984cc4328299d0127067c77c01301e8" exitCode=0 Mar 18 10:32:09 crc 
kubenswrapper[4778]: I0318 10:32:09.119850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerDied","Data":"d4ab5a2a9be3faa54895e3ea4c33073a1984cc4328299d0127067c77c01301e8"} Mar 18 10:32:10 crc kubenswrapper[4778]: I0318 10:32:10.130911 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerStarted","Data":"7efa6734e40475f70dcca064e44946d2776507dea2029d7bc3055a574ac8a45a"} Mar 18 10:32:12 crc kubenswrapper[4778]: I0318 10:32:12.173504 4778 generic.go:334] "Generic (PLEG): container finished" podID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerID="7efa6734e40475f70dcca064e44946d2776507dea2029d7bc3055a574ac8a45a" exitCode=0 Mar 18 10:32:12 crc kubenswrapper[4778]: I0318 10:32:12.173797 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerDied","Data":"7efa6734e40475f70dcca064e44946d2776507dea2029d7bc3055a574ac8a45a"} Mar 18 10:32:13 crc kubenswrapper[4778]: I0318 10:32:13.186004 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerStarted","Data":"0bc612bd31fa753f8d2f562fbc9812f902bbf84c5c3fafa32c598ca1eddc2ac0"} Mar 18 10:32:13 crc kubenswrapper[4778]: I0318 10:32:13.214934 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2xqrs" podStartSLOduration=2.645592533 podStartE2EDuration="6.214913633s" podCreationTimestamp="2026-03-18 10:32:07 +0000 UTC" firstStartedPulling="2026-03-18 10:32:09.122009163 +0000 UTC m=+5395.696754003" lastFinishedPulling="2026-03-18 10:32:12.691330263 +0000 UTC m=+5399.266075103" 
observedRunningTime="2026-03-18 10:32:13.202581628 +0000 UTC m=+5399.777326508" watchObservedRunningTime="2026-03-18 10:32:13.214913633 +0000 UTC m=+5399.789658483" Mar 18 10:32:17 crc kubenswrapper[4778]: I0318 10:32:17.440388 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:17 crc kubenswrapper[4778]: I0318 10:32:17.440999 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:18 crc kubenswrapper[4778]: I0318 10:32:18.513301 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-2xqrs" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="registry-server" probeResult="failure" output=< Mar 18 10:32:18 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:32:18 crc kubenswrapper[4778]: > Mar 18 10:32:21 crc kubenswrapper[4778]: I0318 10:32:21.186842 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:32:21 crc kubenswrapper[4778]: E0318 10:32:21.187657 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:32:27 crc kubenswrapper[4778]: I0318 10:32:27.489015 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:27 crc kubenswrapper[4778]: I0318 10:32:27.541946 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:27 crc kubenswrapper[4778]: I0318 10:32:27.741508 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2xqrs"] Mar 18 10:32:29 crc kubenswrapper[4778]: I0318 10:32:29.331552 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2xqrs" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="registry-server" containerID="cri-o://0bc612bd31fa753f8d2f562fbc9812f902bbf84c5c3fafa32c598ca1eddc2ac0" gracePeriod=2 Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.345004 4778 generic.go:334] "Generic (PLEG): container finished" podID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerID="0bc612bd31fa753f8d2f562fbc9812f902bbf84c5c3fafa32c598ca1eddc2ac0" exitCode=0 Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.345104 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerDied","Data":"0bc612bd31fa753f8d2f562fbc9812f902bbf84c5c3fafa32c598ca1eddc2ac0"} Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.346275 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerDied","Data":"b3b749541cd7e8e5266bf86b325dca76bccc806590445c6a2bf5c909ceffb2a2"} Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.346414 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b749541cd7e8e5266bf86b325dca76bccc806590445c6a2bf5c909ceffb2a2" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.427347 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.449435 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lshv6\" (UniqueName: \"kubernetes.io/projected/85bd1932-7797-4e49-9b0c-a67b5176e03b-kube-api-access-lshv6\") pod \"85bd1932-7797-4e49-9b0c-a67b5176e03b\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.449799 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-utilities\") pod \"85bd1932-7797-4e49-9b0c-a67b5176e03b\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.449947 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-catalog-content\") pod \"85bd1932-7797-4e49-9b0c-a67b5176e03b\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.450799 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-utilities" (OuterVolumeSpecName: "utilities") pod "85bd1932-7797-4e49-9b0c-a67b5176e03b" (UID: "85bd1932-7797-4e49-9b0c-a67b5176e03b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.451055 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.455437 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85bd1932-7797-4e49-9b0c-a67b5176e03b-kube-api-access-lshv6" (OuterVolumeSpecName: "kube-api-access-lshv6") pod "85bd1932-7797-4e49-9b0c-a67b5176e03b" (UID: "85bd1932-7797-4e49-9b0c-a67b5176e03b"). InnerVolumeSpecName "kube-api-access-lshv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.512514 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85bd1932-7797-4e49-9b0c-a67b5176e03b" (UID: "85bd1932-7797-4e49-9b0c-a67b5176e03b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.553031 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lshv6\" (UniqueName: \"kubernetes.io/projected/85bd1932-7797-4e49-9b0c-a67b5176e03b-kube-api-access-lshv6\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.553078 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:31 crc kubenswrapper[4778]: I0318 10:32:31.352877 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:31 crc kubenswrapper[4778]: I0318 10:32:31.393021 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2xqrs"] Mar 18 10:32:31 crc kubenswrapper[4778]: I0318 10:32:31.403453 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2xqrs"] Mar 18 10:32:32 crc kubenswrapper[4778]: I0318 10:32:32.197177 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" path="/var/lib/kubelet/pods/85bd1932-7797-4e49-9b0c-a67b5176e03b/volumes" Mar 18 10:32:34 crc kubenswrapper[4778]: I0318 10:32:34.203560 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:32:35 crc kubenswrapper[4778]: I0318 10:32:35.386022 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"81ef1098ba06a49fc09cc93fdfc6d75d1ab7de8afe5ec57fbd67f4e799fb737b"} Mar 18 10:32:35 crc kubenswrapper[4778]: I0318 10:32:35.730426 4778 scope.go:117] "RemoveContainer" containerID="53ff744bef01ec78e09ff6d04137bcd8e4bacfa9ad131e28bbc23695a24879a8" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.707159 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6h72t"] Mar 18 10:33:36 crc kubenswrapper[4778]: E0318 10:33:36.709036 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="extract-content" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.709113 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="extract-content" Mar 18 10:33:36 crc kubenswrapper[4778]: E0318 
10:33:36.709185 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="registry-server" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.709271 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="registry-server" Mar 18 10:33:36 crc kubenswrapper[4778]: E0318 10:33:36.709344 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="extract-utilities" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.709399 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="extract-utilities" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.709664 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="registry-server" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.710984 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.728315 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6h72t"] Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.837586 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-catalog-content\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.837654 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29ngl\" (UniqueName: \"kubernetes.io/projected/035e3898-3c1a-459c-9fee-a9e16ce10874-kube-api-access-29ngl\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.837848 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-utilities\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.939469 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-utilities\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.939547 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-catalog-content\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.939581 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29ngl\" (UniqueName: \"kubernetes.io/projected/035e3898-3c1a-459c-9fee-a9e16ce10874-kube-api-access-29ngl\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.940010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-utilities\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.940084 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-catalog-content\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.962175 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29ngl\" (UniqueName: \"kubernetes.io/projected/035e3898-3c1a-459c-9fee-a9e16ce10874-kube-api-access-29ngl\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:37 crc kubenswrapper[4778]: I0318 10:33:37.033012 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:37 crc kubenswrapper[4778]: I0318 10:33:37.626187 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6h72t"] Mar 18 10:33:37 crc kubenswrapper[4778]: I0318 10:33:37.974420 4778 generic.go:334] "Generic (PLEG): container finished" podID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerID="24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b" exitCode=0 Mar 18 10:33:37 crc kubenswrapper[4778]: I0318 10:33:37.974494 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerDied","Data":"24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b"} Mar 18 10:33:37 crc kubenswrapper[4778]: I0318 10:33:37.974532 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerStarted","Data":"fdd38a00e590f73ce57630781877646092f03c72baf6f4ae18abcb06af9c3f8c"} Mar 18 10:33:37 crc kubenswrapper[4778]: I0318 10:33:37.977385 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:33:38 crc kubenswrapper[4778]: I0318 10:33:38.984494 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerStarted","Data":"eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610"} Mar 18 10:33:41 crc kubenswrapper[4778]: I0318 10:33:41.012335 4778 generic.go:334] "Generic (PLEG): container finished" podID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerID="eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610" exitCode=0 Mar 18 10:33:41 crc kubenswrapper[4778]: I0318 10:33:41.012454 4778 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerDied","Data":"eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610"} Mar 18 10:33:43 crc kubenswrapper[4778]: I0318 10:33:43.034851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerStarted","Data":"6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0"} Mar 18 10:33:43 crc kubenswrapper[4778]: I0318 10:33:43.053654 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6h72t" podStartSLOduration=3.000627977 podStartE2EDuration="7.053635076s" podCreationTimestamp="2026-03-18 10:33:36 +0000 UTC" firstStartedPulling="2026-03-18 10:33:37.977075228 +0000 UTC m=+5484.551820068" lastFinishedPulling="2026-03-18 10:33:42.030082327 +0000 UTC m=+5488.604827167" observedRunningTime="2026-03-18 10:33:43.052605628 +0000 UTC m=+5489.627350498" watchObservedRunningTime="2026-03-18 10:33:43.053635076 +0000 UTC m=+5489.628379926" Mar 18 10:33:47 crc kubenswrapper[4778]: I0318 10:33:47.033254 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:47 crc kubenswrapper[4778]: I0318 10:33:47.033902 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:47 crc kubenswrapper[4778]: I0318 10:33:47.097295 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:47 crc kubenswrapper[4778]: I0318 10:33:47.157593 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:47 crc kubenswrapper[4778]: I0318 
10:33:47.698313 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6h72t"] Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.091350 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6h72t" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="registry-server" containerID="cri-o://6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0" gracePeriod=2 Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.693401 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.807533 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-utilities\") pod \"035e3898-3c1a-459c-9fee-a9e16ce10874\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.807600 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29ngl\" (UniqueName: \"kubernetes.io/projected/035e3898-3c1a-459c-9fee-a9e16ce10874-kube-api-access-29ngl\") pod \"035e3898-3c1a-459c-9fee-a9e16ce10874\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.807627 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-catalog-content\") pod \"035e3898-3c1a-459c-9fee-a9e16ce10874\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.808445 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-utilities" (OuterVolumeSpecName: 
"utilities") pod "035e3898-3c1a-459c-9fee-a9e16ce10874" (UID: "035e3898-3c1a-459c-9fee-a9e16ce10874"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.816055 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035e3898-3c1a-459c-9fee-a9e16ce10874-kube-api-access-29ngl" (OuterVolumeSpecName: "kube-api-access-29ngl") pod "035e3898-3c1a-459c-9fee-a9e16ce10874" (UID: "035e3898-3c1a-459c-9fee-a9e16ce10874"). InnerVolumeSpecName "kube-api-access-29ngl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.858565 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "035e3898-3c1a-459c-9fee-a9e16ce10874" (UID: "035e3898-3c1a-459c-9fee-a9e16ce10874"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.909553 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.909597 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29ngl\" (UniqueName: \"kubernetes.io/projected/035e3898-3c1a-459c-9fee-a9e16ce10874-kube-api-access-29ngl\") on node \"crc\" DevicePath \"\"" Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.909612 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.106160 4778 generic.go:334] "Generic (PLEG): container finished" podID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerID="6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0" exitCode=0 Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.106255 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.106264 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerDied","Data":"6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0"} Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.111425 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerDied","Data":"fdd38a00e590f73ce57630781877646092f03c72baf6f4ae18abcb06af9c3f8c"} Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.111462 4778 scope.go:117] "RemoveContainer" containerID="6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.147722 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6h72t"] Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.148478 4778 scope.go:117] "RemoveContainer" containerID="eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.160167 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6h72t"] Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.173974 4778 scope.go:117] "RemoveContainer" containerID="24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.199994 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" path="/var/lib/kubelet/pods/035e3898-3c1a-459c-9fee-a9e16ce10874/volumes" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.220930 4778 scope.go:117] "RemoveContainer" 
containerID="6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0" Mar 18 10:33:50 crc kubenswrapper[4778]: E0318 10:33:50.221493 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0\": container with ID starting with 6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0 not found: ID does not exist" containerID="6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.221546 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0"} err="failed to get container status \"6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0\": rpc error: code = NotFound desc = could not find container \"6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0\": container with ID starting with 6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0 not found: ID does not exist" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.221577 4778 scope.go:117] "RemoveContainer" containerID="eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610" Mar 18 10:33:50 crc kubenswrapper[4778]: E0318 10:33:50.222129 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610\": container with ID starting with eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610 not found: ID does not exist" containerID="eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.222231 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610"} err="failed to get container status \"eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610\": rpc error: code = NotFound desc = could not find container \"eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610\": container with ID starting with eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610 not found: ID does not exist" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.222277 4778 scope.go:117] "RemoveContainer" containerID="24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b" Mar 18 10:33:50 crc kubenswrapper[4778]: E0318 10:33:50.222599 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b\": container with ID starting with 24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b not found: ID does not exist" containerID="24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.222650 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b"} err="failed to get container status \"24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b\": rpc error: code = NotFound desc = could not find container \"24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b\": container with ID starting with 24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b not found: ID does not exist" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.167460 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563834-6gcpp"] Mar 18 10:34:00 crc kubenswrapper[4778]: E0318 10:34:00.168528 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="extract-utilities" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.168543 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="extract-utilities" Mar 18 10:34:00 crc kubenswrapper[4778]: E0318 10:34:00.168574 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="registry-server" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.168584 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="registry-server" Mar 18 10:34:00 crc kubenswrapper[4778]: E0318 10:34:00.168627 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="extract-content" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.168637 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="extract-content" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.168893 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="registry-server" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.169719 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.188106 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.188807 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.189000 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.213239 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563834-6gcpp"] Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.325004 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twbl7\" (UniqueName: \"kubernetes.io/projected/03c15483-f10b-4441-8d24-bb2bee9b47d3-kube-api-access-twbl7\") pod \"auto-csr-approver-29563834-6gcpp\" (UID: \"03c15483-f10b-4441-8d24-bb2bee9b47d3\") " pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.426986 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twbl7\" (UniqueName: \"kubernetes.io/projected/03c15483-f10b-4441-8d24-bb2bee9b47d3-kube-api-access-twbl7\") pod \"auto-csr-approver-29563834-6gcpp\" (UID: \"03c15483-f10b-4441-8d24-bb2bee9b47d3\") " pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.450105 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twbl7\" (UniqueName: \"kubernetes.io/projected/03c15483-f10b-4441-8d24-bb2bee9b47d3-kube-api-access-twbl7\") pod \"auto-csr-approver-29563834-6gcpp\" (UID: \"03c15483-f10b-4441-8d24-bb2bee9b47d3\") " 
pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.509319 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:01 crc kubenswrapper[4778]: I0318 10:34:01.010671 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563834-6gcpp"] Mar 18 10:34:01 crc kubenswrapper[4778]: I0318 10:34:01.225707 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" event={"ID":"03c15483-f10b-4441-8d24-bb2bee9b47d3","Type":"ContainerStarted","Data":"81c0dfe845c2e5038e0c59164d4e4e76a0f4a68e40905d577b613e6ea445a2ba"} Mar 18 10:34:03 crc kubenswrapper[4778]: I0318 10:34:03.244041 4778 generic.go:334] "Generic (PLEG): container finished" podID="03c15483-f10b-4441-8d24-bb2bee9b47d3" containerID="af6dff43614ee6bd06dbb1cfdbe51bab1c623028ca6850357300ea8b7c6fb33a" exitCode=0 Mar 18 10:34:03 crc kubenswrapper[4778]: I0318 10:34:03.244096 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" event={"ID":"03c15483-f10b-4441-8d24-bb2bee9b47d3","Type":"ContainerDied","Data":"af6dff43614ee6bd06dbb1cfdbe51bab1c623028ca6850357300ea8b7c6fb33a"} Mar 18 10:34:04 crc kubenswrapper[4778]: I0318 10:34:04.722507 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:04 crc kubenswrapper[4778]: I0318 10:34:04.820445 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twbl7\" (UniqueName: \"kubernetes.io/projected/03c15483-f10b-4441-8d24-bb2bee9b47d3-kube-api-access-twbl7\") pod \"03c15483-f10b-4441-8d24-bb2bee9b47d3\" (UID: \"03c15483-f10b-4441-8d24-bb2bee9b47d3\") " Mar 18 10:34:04 crc kubenswrapper[4778]: I0318 10:34:04.827998 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03c15483-f10b-4441-8d24-bb2bee9b47d3-kube-api-access-twbl7" (OuterVolumeSpecName: "kube-api-access-twbl7") pod "03c15483-f10b-4441-8d24-bb2bee9b47d3" (UID: "03c15483-f10b-4441-8d24-bb2bee9b47d3"). InnerVolumeSpecName "kube-api-access-twbl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:34:04 crc kubenswrapper[4778]: I0318 10:34:04.923908 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twbl7\" (UniqueName: \"kubernetes.io/projected/03c15483-f10b-4441-8d24-bb2bee9b47d3-kube-api-access-twbl7\") on node \"crc\" DevicePath \"\"" Mar 18 10:34:05 crc kubenswrapper[4778]: I0318 10:34:05.263844 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" event={"ID":"03c15483-f10b-4441-8d24-bb2bee9b47d3","Type":"ContainerDied","Data":"81c0dfe845c2e5038e0c59164d4e4e76a0f4a68e40905d577b613e6ea445a2ba"} Mar 18 10:34:05 crc kubenswrapper[4778]: I0318 10:34:05.263943 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81c0dfe845c2e5038e0c59164d4e4e76a0f4a68e40905d577b613e6ea445a2ba" Mar 18 10:34:05 crc kubenswrapper[4778]: I0318 10:34:05.263890 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:05 crc kubenswrapper[4778]: I0318 10:34:05.811181 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563828-bg8zw"] Mar 18 10:34:05 crc kubenswrapper[4778]: I0318 10:34:05.821136 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563828-bg8zw"] Mar 18 10:34:06 crc kubenswrapper[4778]: I0318 10:34:06.197545 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b507e48-f0a9-4938-ad91-298a6f90aad1" path="/var/lib/kubelet/pods/6b507e48-f0a9-4938-ad91-298a6f90aad1/volumes" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.609921 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zzj5b"] Mar 18 10:34:35 crc kubenswrapper[4778]: E0318 10:34:35.610845 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c15483-f10b-4441-8d24-bb2bee9b47d3" containerName="oc" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.610859 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="03c15483-f10b-4441-8d24-bb2bee9b47d3" containerName="oc" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.611043 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="03c15483-f10b-4441-8d24-bb2bee9b47d3" containerName="oc" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.612363 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.632695 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzj5b"] Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.747300 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-utilities\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.747411 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-catalog-content\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.747641 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzkg\" (UniqueName: \"kubernetes.io/projected/930e59dd-04aa-4030-b313-a1268b85ea06-kube-api-access-lvzkg\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.840125 4778 scope.go:117] "RemoveContainer" containerID="37abf2b00f0327218ef392055e69e5b8b65d6c2f27975137e66e86722fc34dea" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.850110 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzkg\" (UniqueName: \"kubernetes.io/projected/930e59dd-04aa-4030-b313-a1268b85ea06-kube-api-access-lvzkg\") pod \"redhat-marketplace-zzj5b\" (UID: 
\"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.850265 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-utilities\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.850340 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-catalog-content\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.850906 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-catalog-content\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.851641 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-utilities\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.879037 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzkg\" (UniqueName: \"kubernetes.io/projected/930e59dd-04aa-4030-b313-a1268b85ea06-kube-api-access-lvzkg\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " 
pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.941915 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:36 crc kubenswrapper[4778]: I0318 10:34:36.491477 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzj5b"] Mar 18 10:34:36 crc kubenswrapper[4778]: I0318 10:34:36.544059 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzj5b" event={"ID":"930e59dd-04aa-4030-b313-a1268b85ea06","Type":"ContainerStarted","Data":"1d6fc8b17c50bba9a49271f449e1493e2b8a87fde1e532a527688e36d2719ffc"} Mar 18 10:34:37 crc kubenswrapper[4778]: I0318 10:34:37.554145 4778 generic.go:334] "Generic (PLEG): container finished" podID="930e59dd-04aa-4030-b313-a1268b85ea06" containerID="c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c" exitCode=0 Mar 18 10:34:37 crc kubenswrapper[4778]: I0318 10:34:37.554184 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzj5b" event={"ID":"930e59dd-04aa-4030-b313-a1268b85ea06","Type":"ContainerDied","Data":"c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c"} Mar 18 10:34:39 crc kubenswrapper[4778]: I0318 10:34:39.575794 4778 generic.go:334] "Generic (PLEG): container finished" podID="930e59dd-04aa-4030-b313-a1268b85ea06" containerID="9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6" exitCode=0 Mar 18 10:34:39 crc kubenswrapper[4778]: I0318 10:34:39.575901 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzj5b" event={"ID":"930e59dd-04aa-4030-b313-a1268b85ea06","Type":"ContainerDied","Data":"9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6"} Mar 18 10:34:40 crc kubenswrapper[4778]: I0318 10:34:40.586516 4778 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-zzj5b" event={"ID":"930e59dd-04aa-4030-b313-a1268b85ea06","Type":"ContainerStarted","Data":"f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b"} Mar 18 10:34:40 crc kubenswrapper[4778]: I0318 10:34:40.613867 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zzj5b" podStartSLOduration=2.796360323 podStartE2EDuration="5.613844764s" podCreationTimestamp="2026-03-18 10:34:35 +0000 UTC" firstStartedPulling="2026-03-18 10:34:37.556341644 +0000 UTC m=+5544.131086484" lastFinishedPulling="2026-03-18 10:34:40.373826085 +0000 UTC m=+5546.948570925" observedRunningTime="2026-03-18 10:34:40.605943939 +0000 UTC m=+5547.180688789" watchObservedRunningTime="2026-03-18 10:34:40.613844764 +0000 UTC m=+5547.188589614" Mar 18 10:34:45 crc kubenswrapper[4778]: I0318 10:34:45.942051 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:45 crc kubenswrapper[4778]: I0318 10:34:45.942568 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:46 crc kubenswrapper[4778]: I0318 10:34:46.011423 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:46 crc kubenswrapper[4778]: I0318 10:34:46.722615 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:46 crc kubenswrapper[4778]: I0318 10:34:46.770713 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzj5b"] Mar 18 10:34:48 crc kubenswrapper[4778]: I0318 10:34:48.687705 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zzj5b" 
podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="registry-server" containerID="cri-o://f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b" gracePeriod=2 Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.281938 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.330157 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvzkg\" (UniqueName: \"kubernetes.io/projected/930e59dd-04aa-4030-b313-a1268b85ea06-kube-api-access-lvzkg\") pod \"930e59dd-04aa-4030-b313-a1268b85ea06\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.330412 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-utilities\") pod \"930e59dd-04aa-4030-b313-a1268b85ea06\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.330459 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-catalog-content\") pod \"930e59dd-04aa-4030-b313-a1268b85ea06\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.332390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-utilities" (OuterVolumeSpecName: "utilities") pod "930e59dd-04aa-4030-b313-a1268b85ea06" (UID: "930e59dd-04aa-4030-b313-a1268b85ea06"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.335967 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/930e59dd-04aa-4030-b313-a1268b85ea06-kube-api-access-lvzkg" (OuterVolumeSpecName: "kube-api-access-lvzkg") pod "930e59dd-04aa-4030-b313-a1268b85ea06" (UID: "930e59dd-04aa-4030-b313-a1268b85ea06"). InnerVolumeSpecName "kube-api-access-lvzkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.360931 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "930e59dd-04aa-4030-b313-a1268b85ea06" (UID: "930e59dd-04aa-4030-b313-a1268b85ea06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.433249 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvzkg\" (UniqueName: \"kubernetes.io/projected/930e59dd-04aa-4030-b313-a1268b85ea06-kube-api-access-lvzkg\") on node \"crc\" DevicePath \"\"" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.433301 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.433313 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.696370 4778 generic.go:334] "Generic (PLEG): container finished" podID="930e59dd-04aa-4030-b313-a1268b85ea06" 
containerID="f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b" exitCode=0 Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.696434 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.696452 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzj5b" event={"ID":"930e59dd-04aa-4030-b313-a1268b85ea06","Type":"ContainerDied","Data":"f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b"} Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.697564 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzj5b" event={"ID":"930e59dd-04aa-4030-b313-a1268b85ea06","Type":"ContainerDied","Data":"1d6fc8b17c50bba9a49271f449e1493e2b8a87fde1e532a527688e36d2719ffc"} Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.697594 4778 scope.go:117] "RemoveContainer" containerID="f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.722502 4778 scope.go:117] "RemoveContainer" containerID="9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.736882 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzj5b"] Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.749974 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzj5b"] Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.759303 4778 scope.go:117] "RemoveContainer" containerID="c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.787847 4778 scope.go:117] "RemoveContainer" containerID="f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b" Mar 18 
10:34:49 crc kubenswrapper[4778]: E0318 10:34:49.788278 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b\": container with ID starting with f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b not found: ID does not exist" containerID="f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.788308 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b"} err="failed to get container status \"f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b\": rpc error: code = NotFound desc = could not find container \"f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b\": container with ID starting with f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b not found: ID does not exist" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.788328 4778 scope.go:117] "RemoveContainer" containerID="9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6" Mar 18 10:34:49 crc kubenswrapper[4778]: E0318 10:34:49.788690 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6\": container with ID starting with 9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6 not found: ID does not exist" containerID="9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.788714 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6"} err="failed to get container status 
\"9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6\": rpc error: code = NotFound desc = could not find container \"9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6\": container with ID starting with 9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6 not found: ID does not exist" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.788730 4778 scope.go:117] "RemoveContainer" containerID="c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c" Mar 18 10:34:49 crc kubenswrapper[4778]: E0318 10:34:49.788950 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c\": container with ID starting with c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c not found: ID does not exist" containerID="c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.788970 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c"} err="failed to get container status \"c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c\": rpc error: code = NotFound desc = could not find container \"c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c\": container with ID starting with c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c not found: ID does not exist" Mar 18 10:34:50 crc kubenswrapper[4778]: I0318 10:34:50.197811 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" path="/var/lib/kubelet/pods/930e59dd-04aa-4030-b313-a1268b85ea06/volumes" Mar 18 10:35:00 crc kubenswrapper[4778]: I0318 10:35:00.147642 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:35:00 crc kubenswrapper[4778]: I0318 10:35:00.149549 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:35:30 crc kubenswrapper[4778]: I0318 10:35:30.147665 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:35:30 crc kubenswrapper[4778]: I0318 10:35:30.148549 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.147705 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.149361 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.149480 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.150588 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81ef1098ba06a49fc09cc93fdfc6d75d1ab7de8afe5ec57fbd67f4e799fb737b"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.151011 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://81ef1098ba06a49fc09cc93fdfc6d75d1ab7de8afe5ec57fbd67f4e799fb737b" gracePeriod=600 Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.176797 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563836-vj8r7"] Mar 18 10:36:00 crc kubenswrapper[4778]: E0318 10:36:00.177189 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="extract-utilities" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.177223 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="extract-utilities" Mar 18 10:36:00 crc kubenswrapper[4778]: E0318 10:36:00.177232 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="extract-content" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.177239 4778 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="extract-content" Mar 18 10:36:00 crc kubenswrapper[4778]: E0318 10:36:00.177248 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="registry-server" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.177255 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="registry-server" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.177476 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="registry-server" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.178065 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.186884 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.187117 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.191941 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.209288 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563836-vj8r7"] Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.348158 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmkl4\" (UniqueName: \"kubernetes.io/projected/e2bea522-9825-4193-9fb8-6592bcc1e2c8-kube-api-access-rmkl4\") pod \"auto-csr-approver-29563836-vj8r7\" (UID: \"e2bea522-9825-4193-9fb8-6592bcc1e2c8\") " 
pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.413495 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="81ef1098ba06a49fc09cc93fdfc6d75d1ab7de8afe5ec57fbd67f4e799fb737b" exitCode=0 Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.413585 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"81ef1098ba06a49fc09cc93fdfc6d75d1ab7de8afe5ec57fbd67f4e799fb737b"} Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.413636 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.450570 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmkl4\" (UniqueName: \"kubernetes.io/projected/e2bea522-9825-4193-9fb8-6592bcc1e2c8-kube-api-access-rmkl4\") pod \"auto-csr-approver-29563836-vj8r7\" (UID: \"e2bea522-9825-4193-9fb8-6592bcc1e2c8\") " pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.473225 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmkl4\" (UniqueName: \"kubernetes.io/projected/e2bea522-9825-4193-9fb8-6592bcc1e2c8-kube-api-access-rmkl4\") pod \"auto-csr-approver-29563836-vj8r7\" (UID: \"e2bea522-9825-4193-9fb8-6592bcc1e2c8\") " pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.512268 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:01 crc kubenswrapper[4778]: I0318 10:36:01.001457 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563836-vj8r7"] Mar 18 10:36:01 crc kubenswrapper[4778]: W0318 10:36:01.010279 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2bea522_9825_4193_9fb8_6592bcc1e2c8.slice/crio-fcf5a420915901661b1c58125c78978528b2f6cc296d9385b6bc42cb835e12fa WatchSource:0}: Error finding container fcf5a420915901661b1c58125c78978528b2f6cc296d9385b6bc42cb835e12fa: Status 404 returned error can't find the container with id fcf5a420915901661b1c58125c78978528b2f6cc296d9385b6bc42cb835e12fa Mar 18 10:36:01 crc kubenswrapper[4778]: I0318 10:36:01.424423 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5"} Mar 18 10:36:01 crc kubenswrapper[4778]: I0318 10:36:01.425871 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" event={"ID":"e2bea522-9825-4193-9fb8-6592bcc1e2c8","Type":"ContainerStarted","Data":"fcf5a420915901661b1c58125c78978528b2f6cc296d9385b6bc42cb835e12fa"} Mar 18 10:36:02 crc kubenswrapper[4778]: I0318 10:36:02.437307 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" event={"ID":"e2bea522-9825-4193-9fb8-6592bcc1e2c8","Type":"ContainerStarted","Data":"927eb69f67f4add2a332f661db657c63610291a16f8bad5b3ab6b4321fca9e64"} Mar 18 10:36:03 crc kubenswrapper[4778]: I0318 10:36:03.447517 4778 generic.go:334] "Generic (PLEG): container finished" podID="e2bea522-9825-4193-9fb8-6592bcc1e2c8" 
containerID="927eb69f67f4add2a332f661db657c63610291a16f8bad5b3ab6b4321fca9e64" exitCode=0 Mar 18 10:36:03 crc kubenswrapper[4778]: I0318 10:36:03.447664 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" event={"ID":"e2bea522-9825-4193-9fb8-6592bcc1e2c8","Type":"ContainerDied","Data":"927eb69f67f4add2a332f661db657c63610291a16f8bad5b3ab6b4321fca9e64"} Mar 18 10:36:04 crc kubenswrapper[4778]: I0318 10:36:04.959443 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.154949 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmkl4\" (UniqueName: \"kubernetes.io/projected/e2bea522-9825-4193-9fb8-6592bcc1e2c8-kube-api-access-rmkl4\") pod \"e2bea522-9825-4193-9fb8-6592bcc1e2c8\" (UID: \"e2bea522-9825-4193-9fb8-6592bcc1e2c8\") " Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.169611 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bea522-9825-4193-9fb8-6592bcc1e2c8-kube-api-access-rmkl4" (OuterVolumeSpecName: "kube-api-access-rmkl4") pod "e2bea522-9825-4193-9fb8-6592bcc1e2c8" (UID: "e2bea522-9825-4193-9fb8-6592bcc1e2c8"). InnerVolumeSpecName "kube-api-access-rmkl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.257927 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmkl4\" (UniqueName: \"kubernetes.io/projected/e2bea522-9825-4193-9fb8-6592bcc1e2c8-kube-api-access-rmkl4\") on node \"crc\" DevicePath \"\"" Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.485329 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" event={"ID":"e2bea522-9825-4193-9fb8-6592bcc1e2c8","Type":"ContainerDied","Data":"fcf5a420915901661b1c58125c78978528b2f6cc296d9385b6bc42cb835e12fa"} Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.485716 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcf5a420915901661b1c58125c78978528b2f6cc296d9385b6bc42cb835e12fa" Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.485499 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.555921 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563830-cmbbv"] Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.571098 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563830-cmbbv"] Mar 18 10:36:06 crc kubenswrapper[4778]: I0318 10:36:06.201621 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020a579d-1395-4039-8c3a-7454709e9af6" path="/var/lib/kubelet/pods/020a579d-1395-4039-8c3a-7454709e9af6/volumes" Mar 18 10:36:36 crc kubenswrapper[4778]: I0318 10:36:36.006235 4778 scope.go:117] "RemoveContainer" containerID="e298907ce2b631ab1e3060efd7429ad70a3f2d93551c33b2a088ad16a12f01ae" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.148313 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.149286 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.183252 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563838-xhh59"] Mar 18 10:38:00 crc kubenswrapper[4778]: E0318 10:38:00.183937 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bea522-9825-4193-9fb8-6592bcc1e2c8" containerName="oc" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.183965 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bea522-9825-4193-9fb8-6592bcc1e2c8" containerName="oc" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.184284 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bea522-9825-4193-9fb8-6592bcc1e2c8" containerName="oc" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.185382 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.188785 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.188848 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.188985 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.204806 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563838-xhh59"] Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.230834 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7njtv\" (UniqueName: \"kubernetes.io/projected/06d0e7b4-0fff-4364-bef0-a408acdbcdbb-kube-api-access-7njtv\") pod \"auto-csr-approver-29563838-xhh59\" (UID: \"06d0e7b4-0fff-4364-bef0-a408acdbcdbb\") " pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.332966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7njtv\" (UniqueName: \"kubernetes.io/projected/06d0e7b4-0fff-4364-bef0-a408acdbcdbb-kube-api-access-7njtv\") pod \"auto-csr-approver-29563838-xhh59\" (UID: \"06d0e7b4-0fff-4364-bef0-a408acdbcdbb\") " pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.363821 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7njtv\" (UniqueName: \"kubernetes.io/projected/06d0e7b4-0fff-4364-bef0-a408acdbcdbb-kube-api-access-7njtv\") pod \"auto-csr-approver-29563838-xhh59\" (UID: \"06d0e7b4-0fff-4364-bef0-a408acdbcdbb\") " 
pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.529062 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:01 crc kubenswrapper[4778]: I0318 10:38:01.016371 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563838-xhh59"] Mar 18 10:38:01 crc kubenswrapper[4778]: W0318 10:38:01.022181 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06d0e7b4_0fff_4364_bef0_a408acdbcdbb.slice/crio-95450eddfc15632f0a41a2ff2c235011258dec14c97bc62d889c97127ad1e24a WatchSource:0}: Error finding container 95450eddfc15632f0a41a2ff2c235011258dec14c97bc62d889c97127ad1e24a: Status 404 returned error can't find the container with id 95450eddfc15632f0a41a2ff2c235011258dec14c97bc62d889c97127ad1e24a Mar 18 10:38:01 crc kubenswrapper[4778]: I0318 10:38:01.794749 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563838-xhh59" event={"ID":"06d0e7b4-0fff-4364-bef0-a408acdbcdbb","Type":"ContainerStarted","Data":"95450eddfc15632f0a41a2ff2c235011258dec14c97bc62d889c97127ad1e24a"} Mar 18 10:38:02 crc kubenswrapper[4778]: I0318 10:38:02.809754 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563838-xhh59" event={"ID":"06d0e7b4-0fff-4364-bef0-a408acdbcdbb","Type":"ContainerStarted","Data":"646650d3ca5a64cb0621f10e2cff5ed5ebd691e96f89d98ddbd2ca48d396fddb"} Mar 18 10:38:02 crc kubenswrapper[4778]: I0318 10:38:02.835697 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563838-xhh59" podStartSLOduration=1.493443228 podStartE2EDuration="2.835679986s" podCreationTimestamp="2026-03-18 10:38:00 +0000 UTC" firstStartedPulling="2026-03-18 10:38:01.02487797 +0000 UTC 
m=+5747.599622850" lastFinishedPulling="2026-03-18 10:38:02.367114778 +0000 UTC m=+5748.941859608" observedRunningTime="2026-03-18 10:38:02.832348495 +0000 UTC m=+5749.407093345" watchObservedRunningTime="2026-03-18 10:38:02.835679986 +0000 UTC m=+5749.410424846" Mar 18 10:38:03 crc kubenswrapper[4778]: I0318 10:38:03.823775 4778 generic.go:334] "Generic (PLEG): container finished" podID="06d0e7b4-0fff-4364-bef0-a408acdbcdbb" containerID="646650d3ca5a64cb0621f10e2cff5ed5ebd691e96f89d98ddbd2ca48d396fddb" exitCode=0 Mar 18 10:38:03 crc kubenswrapper[4778]: I0318 10:38:03.823940 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563838-xhh59" event={"ID":"06d0e7b4-0fff-4364-bef0-a408acdbcdbb","Type":"ContainerDied","Data":"646650d3ca5a64cb0621f10e2cff5ed5ebd691e96f89d98ddbd2ca48d396fddb"} Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.333970 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.464575 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7njtv\" (UniqueName: \"kubernetes.io/projected/06d0e7b4-0fff-4364-bef0-a408acdbcdbb-kube-api-access-7njtv\") pod \"06d0e7b4-0fff-4364-bef0-a408acdbcdbb\" (UID: \"06d0e7b4-0fff-4364-bef0-a408acdbcdbb\") " Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.472460 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06d0e7b4-0fff-4364-bef0-a408acdbcdbb-kube-api-access-7njtv" (OuterVolumeSpecName: "kube-api-access-7njtv") pod "06d0e7b4-0fff-4364-bef0-a408acdbcdbb" (UID: "06d0e7b4-0fff-4364-bef0-a408acdbcdbb"). InnerVolumeSpecName "kube-api-access-7njtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.566861 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7njtv\" (UniqueName: \"kubernetes.io/projected/06d0e7b4-0fff-4364-bef0-a408acdbcdbb-kube-api-access-7njtv\") on node \"crc\" DevicePath \"\"" Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.842712 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563838-xhh59" event={"ID":"06d0e7b4-0fff-4364-bef0-a408acdbcdbb","Type":"ContainerDied","Data":"95450eddfc15632f0a41a2ff2c235011258dec14c97bc62d889c97127ad1e24a"} Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.842758 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95450eddfc15632f0a41a2ff2c235011258dec14c97bc62d889c97127ad1e24a" Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.842792 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.908823 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563832-8vv9t"] Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.918231 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563832-8vv9t"] Mar 18 10:38:06 crc kubenswrapper[4778]: I0318 10:38:06.205284 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f52f16-1c49-4aa8-9e7b-10a9bf55e487" path="/var/lib/kubelet/pods/19f52f16-1c49-4aa8-9e7b-10a9bf55e487/volumes" Mar 18 10:38:30 crc kubenswrapper[4778]: I0318 10:38:30.147122 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 10:38:30 crc kubenswrapper[4778]: I0318 10:38:30.147844 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.496145 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zvhbg"] Mar 18 10:38:32 crc kubenswrapper[4778]: E0318 10:38:32.497472 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d0e7b4-0fff-4364-bef0-a408acdbcdbb" containerName="oc" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.497498 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d0e7b4-0fff-4364-bef0-a408acdbcdbb" containerName="oc" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.497823 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d0e7b4-0fff-4364-bef0-a408acdbcdbb" containerName="oc" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.500070 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.517970 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvhbg"] Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.642522 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-utilities\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.642852 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmn8p\" (UniqueName: \"kubernetes.io/projected/934fd461-09e2-4014-84fb-c5cdf66dd804-kube-api-access-tmn8p\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.642999 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-catalog-content\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.745249 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-utilities\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.745609 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tmn8p\" (UniqueName: \"kubernetes.io/projected/934fd461-09e2-4014-84fb-c5cdf66dd804-kube-api-access-tmn8p\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.745706 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-catalog-content\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.745762 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-utilities\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.746271 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-catalog-content\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.780131 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmn8p\" (UniqueName: \"kubernetes.io/projected/934fd461-09e2-4014-84fb-c5cdf66dd804-kube-api-access-tmn8p\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.824935 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:33 crc kubenswrapper[4778]: I0318 10:38:33.260540 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvhbg"] Mar 18 10:38:34 crc kubenswrapper[4778]: I0318 10:38:34.107026 4778 generic.go:334] "Generic (PLEG): container finished" podID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerID="154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca" exitCode=0 Mar 18 10:38:34 crc kubenswrapper[4778]: I0318 10:38:34.107131 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerDied","Data":"154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca"} Mar 18 10:38:34 crc kubenswrapper[4778]: I0318 10:38:34.107383 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerStarted","Data":"99bf290ef30cb2d2aba1ab1168b5a2679f44c5fe78de34c90d88b0c1396003bf"} Mar 18 10:38:36 crc kubenswrapper[4778]: I0318 10:38:36.104334 4778 scope.go:117] "RemoveContainer" containerID="7efa6734e40475f70dcca064e44946d2776507dea2029d7bc3055a574ac8a45a" Mar 18 10:38:36 crc kubenswrapper[4778]: I0318 10:38:36.149417 4778 scope.go:117] "RemoveContainer" containerID="0bc612bd31fa753f8d2f562fbc9812f902bbf84c5c3fafa32c598ca1eddc2ac0" Mar 18 10:38:36 crc kubenswrapper[4778]: I0318 10:38:36.156848 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerStarted","Data":"b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6"} Mar 18 10:38:36 crc kubenswrapper[4778]: I0318 10:38:36.242410 4778 scope.go:117] "RemoveContainer" containerID="b3e45b3c111cb68776e6b0e92c1ba10a6ec66666310504f0fe918ad9a95b9a9e" 
Mar 18 10:38:36 crc kubenswrapper[4778]: I0318 10:38:36.314255 4778 scope.go:117] "RemoveContainer" containerID="d4ab5a2a9be3faa54895e3ea4c33073a1984cc4328299d0127067c77c01301e8" Mar 18 10:38:40 crc kubenswrapper[4778]: I0318 10:38:40.198729 4778 generic.go:334] "Generic (PLEG): container finished" podID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerID="b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6" exitCode=0 Mar 18 10:38:40 crc kubenswrapper[4778]: I0318 10:38:40.198811 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerDied","Data":"b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6"} Mar 18 10:38:40 crc kubenswrapper[4778]: I0318 10:38:40.202010 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:38:41 crc kubenswrapper[4778]: I0318 10:38:41.211225 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerStarted","Data":"b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b"} Mar 18 10:38:41 crc kubenswrapper[4778]: I0318 10:38:41.288313 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zvhbg" podStartSLOduration=2.7254747999999998 podStartE2EDuration="9.288291012s" podCreationTimestamp="2026-03-18 10:38:32 +0000 UTC" firstStartedPulling="2026-03-18 10:38:34.109231159 +0000 UTC m=+5780.683975999" lastFinishedPulling="2026-03-18 10:38:40.672047361 +0000 UTC m=+5787.246792211" observedRunningTime="2026-03-18 10:38:41.239105268 +0000 UTC m=+5787.813850138" watchObservedRunningTime="2026-03-18 10:38:41.288291012 +0000 UTC m=+5787.863035852" Mar 18 10:38:42 crc kubenswrapper[4778]: I0318 10:38:42.825541 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:42 crc kubenswrapper[4778]: I0318 10:38:42.826025 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:43 crc kubenswrapper[4778]: I0318 10:38:43.874567 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zvhbg" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="registry-server" probeResult="failure" output=< Mar 18 10:38:43 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:38:43 crc kubenswrapper[4778]: > Mar 18 10:38:52 crc kubenswrapper[4778]: I0318 10:38:52.884706 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:52 crc kubenswrapper[4778]: I0318 10:38:52.948885 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:53 crc kubenswrapper[4778]: I0318 10:38:53.123224 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvhbg"] Mar 18 10:38:54 crc kubenswrapper[4778]: I0318 10:38:54.315147 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zvhbg" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="registry-server" containerID="cri-o://b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b" gracePeriod=2 Mar 18 10:38:54 crc kubenswrapper[4778]: I0318 10:38:54.985598 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.099218 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-utilities\") pod \"934fd461-09e2-4014-84fb-c5cdf66dd804\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.099838 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-catalog-content\") pod \"934fd461-09e2-4014-84fb-c5cdf66dd804\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.099912 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmn8p\" (UniqueName: \"kubernetes.io/projected/934fd461-09e2-4014-84fb-c5cdf66dd804-kube-api-access-tmn8p\") pod \"934fd461-09e2-4014-84fb-c5cdf66dd804\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.100392 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-utilities" (OuterVolumeSpecName: "utilities") pod "934fd461-09e2-4014-84fb-c5cdf66dd804" (UID: "934fd461-09e2-4014-84fb-c5cdf66dd804"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.107774 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934fd461-09e2-4014-84fb-c5cdf66dd804-kube-api-access-tmn8p" (OuterVolumeSpecName: "kube-api-access-tmn8p") pod "934fd461-09e2-4014-84fb-c5cdf66dd804" (UID: "934fd461-09e2-4014-84fb-c5cdf66dd804"). InnerVolumeSpecName "kube-api-access-tmn8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.202023 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.202087 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmn8p\" (UniqueName: \"kubernetes.io/projected/934fd461-09e2-4014-84fb-c5cdf66dd804-kube-api-access-tmn8p\") on node \"crc\" DevicePath \"\"" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.253835 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "934fd461-09e2-4014-84fb-c5cdf66dd804" (UID: "934fd461-09e2-4014-84fb-c5cdf66dd804"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.303693 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.326831 4778 generic.go:334] "Generic (PLEG): container finished" podID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerID="b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b" exitCode=0 Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.326880 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerDied","Data":"b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b"} Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.326910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerDied","Data":"99bf290ef30cb2d2aba1ab1168b5a2679f44c5fe78de34c90d88b0c1396003bf"} Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.326908 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.326926 4778 scope.go:117] "RemoveContainer" containerID="b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.357186 4778 scope.go:117] "RemoveContainer" containerID="b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.371756 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvhbg"] Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.382790 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zvhbg"] Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.385485 4778 scope.go:117] "RemoveContainer" containerID="154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.429659 4778 scope.go:117] "RemoveContainer" containerID="b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b" Mar 18 10:38:55 crc kubenswrapper[4778]: E0318 10:38:55.431067 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b\": container with ID starting with b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b not found: ID does not exist" containerID="b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.431116 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b"} err="failed to get container status \"b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b\": rpc error: code = NotFound desc = could not find container \"b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b\": container with ID starting with b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b not found: ID does not exist" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.431149 4778 scope.go:117] "RemoveContainer" containerID="b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6" Mar 18 10:38:55 crc kubenswrapper[4778]: E0318 10:38:55.432288 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6\": container with ID starting with b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6 not found: ID does not exist" containerID="b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.432328 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6"} err="failed to get container status \"b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6\": rpc error: code = NotFound desc = could not find container \"b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6\": container with ID starting with b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6 not found: ID does not exist" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.432348 4778 scope.go:117] "RemoveContainer" containerID="154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca" Mar 18 10:38:55 crc kubenswrapper[4778]: E0318 
10:38:55.433570 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca\": container with ID starting with 154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca not found: ID does not exist" containerID="154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.433615 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca"} err="failed to get container status \"154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca\": rpc error: code = NotFound desc = could not find container \"154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca\": container with ID starting with 154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca not found: ID does not exist" Mar 18 10:38:56 crc kubenswrapper[4778]: I0318 10:38:56.198287 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" path="/var/lib/kubelet/pods/934fd461-09e2-4014-84fb-c5cdf66dd804/volumes" Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.147938 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.149137 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.149242 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.150483 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.150566 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" gracePeriod=600 Mar 18 10:39:00 crc kubenswrapper[4778]: E0318 10:39:00.273920 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.377131 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" exitCode=0 Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.377226 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5"} Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.377500 4778 scope.go:117] "RemoveContainer" containerID="81ef1098ba06a49fc09cc93fdfc6d75d1ab7de8afe5ec57fbd67f4e799fb737b" Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.378132 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:39:00 crc kubenswrapper[4778]: E0318 10:39:00.378450 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:39:16 crc kubenswrapper[4778]: I0318 10:39:16.191012 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:39:16 crc kubenswrapper[4778]: E0318 10:39:16.191875 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:39:30 crc kubenswrapper[4778]: I0318 10:39:30.187406 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:39:30 crc kubenswrapper[4778]: E0318 10:39:30.188666 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:39:43 crc kubenswrapper[4778]: I0318 10:39:43.188753 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:39:43 crc kubenswrapper[4778]: E0318 10:39:43.189607 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:39:54 crc kubenswrapper[4778]: I0318 10:39:54.197516 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:39:54 crc kubenswrapper[4778]: E0318 10:39:54.198321 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.155950 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563840-fv5mm"] Mar 18 10:40:00 crc kubenswrapper[4778]: E0318 10:40:00.156856 4778 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="extract-utilities" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.156871 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="extract-utilities" Mar 18 10:40:00 crc kubenswrapper[4778]: E0318 10:40:00.156894 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="extract-content" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.156919 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="extract-content" Mar 18 10:40:00 crc kubenswrapper[4778]: E0318 10:40:00.156939 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="registry-server" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.156949 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="registry-server" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.157180 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="registry-server" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.158125 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.160216 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.160295 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.160485 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.167699 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563840-fv5mm"] Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.245782 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcr6l\" (UniqueName: \"kubernetes.io/projected/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e-kube-api-access-zcr6l\") pod \"auto-csr-approver-29563840-fv5mm\" (UID: \"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e\") " pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.347448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcr6l\" (UniqueName: \"kubernetes.io/projected/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e-kube-api-access-zcr6l\") pod \"auto-csr-approver-29563840-fv5mm\" (UID: \"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e\") " pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.368139 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcr6l\" (UniqueName: \"kubernetes.io/projected/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e-kube-api-access-zcr6l\") pod \"auto-csr-approver-29563840-fv5mm\" (UID: \"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e\") " 
pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.482599 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.990122 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563840-fv5mm"] Mar 18 10:40:01 crc kubenswrapper[4778]: I0318 10:40:01.918404 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" event={"ID":"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e","Type":"ContainerStarted","Data":"49451722c5ce52687662ad1e30323c0a103145961602751f1e8768d627cdc406"} Mar 18 10:40:02 crc kubenswrapper[4778]: I0318 10:40:02.926443 4778 generic.go:334] "Generic (PLEG): container finished" podID="dc3d0dd7-380a-4f1c-bb78-f8df1a73362e" containerID="105700f78835bb2b225e76573d66e982ecb74a475715ed5f6fa69ca1e19eafce" exitCode=0 Mar 18 10:40:02 crc kubenswrapper[4778]: I0318 10:40:02.928153 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" event={"ID":"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e","Type":"ContainerDied","Data":"105700f78835bb2b225e76573d66e982ecb74a475715ed5f6fa69ca1e19eafce"} Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.558827 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.643886 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcr6l\" (UniqueName: \"kubernetes.io/projected/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e-kube-api-access-zcr6l\") pod \"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e\" (UID: \"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e\") " Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.655561 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e-kube-api-access-zcr6l" (OuterVolumeSpecName: "kube-api-access-zcr6l") pod "dc3d0dd7-380a-4f1c-bb78-f8df1a73362e" (UID: "dc3d0dd7-380a-4f1c-bb78-f8df1a73362e"). InnerVolumeSpecName "kube-api-access-zcr6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.746617 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcr6l\" (UniqueName: \"kubernetes.io/projected/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e-kube-api-access-zcr6l\") on node \"crc\" DevicePath \"\"" Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.944612 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" event={"ID":"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e","Type":"ContainerDied","Data":"49451722c5ce52687662ad1e30323c0a103145961602751f1e8768d627cdc406"} Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.944813 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.944901 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49451722c5ce52687662ad1e30323c0a103145961602751f1e8768d627cdc406" Mar 18 10:40:05 crc kubenswrapper[4778]: I0318 10:40:05.630711 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563834-6gcpp"] Mar 18 10:40:05 crc kubenswrapper[4778]: I0318 10:40:05.639514 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563834-6gcpp"] Mar 18 10:40:06 crc kubenswrapper[4778]: I0318 10:40:06.187409 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:40:06 crc kubenswrapper[4778]: E0318 10:40:06.187700 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:40:06 crc kubenswrapper[4778]: I0318 10:40:06.196419 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03c15483-f10b-4441-8d24-bb2bee9b47d3" path="/var/lib/kubelet/pods/03c15483-f10b-4441-8d24-bb2bee9b47d3/volumes" Mar 18 10:40:19 crc kubenswrapper[4778]: I0318 10:40:19.187337 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:40:19 crc kubenswrapper[4778]: E0318 10:40:19.188101 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:40:32 crc kubenswrapper[4778]: I0318 10:40:32.187809 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:40:32 crc kubenswrapper[4778]: E0318 10:40:32.188621 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:40:36 crc kubenswrapper[4778]: I0318 10:40:36.460586 4778 scope.go:117] "RemoveContainer" containerID="af6dff43614ee6bd06dbb1cfdbe51bab1c623028ca6850357300ea8b7c6fb33a" Mar 18 10:40:44 crc kubenswrapper[4778]: I0318 10:40:44.193558 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:40:44 crc kubenswrapper[4778]: E0318 10:40:44.195972 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:40:59 crc kubenswrapper[4778]: I0318 10:40:59.187634 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:40:59 crc kubenswrapper[4778]: 
E0318 10:40:59.188653 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:41:10 crc kubenswrapper[4778]: I0318 10:41:10.187743 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:41:10 crc kubenswrapper[4778]: E0318 10:41:10.188989 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:41:23 crc kubenswrapper[4778]: I0318 10:41:23.187969 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:41:23 crc kubenswrapper[4778]: E0318 10:41:23.189368 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:41:37 crc kubenswrapper[4778]: I0318 10:41:37.187051 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:41:37 crc 
kubenswrapper[4778]: E0318 10:41:37.188821 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:41:50 crc kubenswrapper[4778]: I0318 10:41:50.187024 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:41:50 crc kubenswrapper[4778]: E0318 10:41:50.188047 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.161227 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563842-ckkmp"] Mar 18 10:42:00 crc kubenswrapper[4778]: E0318 10:42:00.162723 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3d0dd7-380a-4f1c-bb78-f8df1a73362e" containerName="oc" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.162751 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3d0dd7-380a-4f1c-bb78-f8df1a73362e" containerName="oc" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.163254 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3d0dd7-380a-4f1c-bb78-f8df1a73362e" containerName="oc" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.164502 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.166790 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.167008 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.167086 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.174428 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563842-ckkmp"] Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.181023 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf9fm\" (UniqueName: \"kubernetes.io/projected/d10faaed-ffef-4afb-9f75-262e4fccd22a-kube-api-access-jf9fm\") pod \"auto-csr-approver-29563842-ckkmp\" (UID: \"d10faaed-ffef-4afb-9f75-262e4fccd22a\") " pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.283437 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf9fm\" (UniqueName: \"kubernetes.io/projected/d10faaed-ffef-4afb-9f75-262e4fccd22a-kube-api-access-jf9fm\") pod \"auto-csr-approver-29563842-ckkmp\" (UID: \"d10faaed-ffef-4afb-9f75-262e4fccd22a\") " pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.306998 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf9fm\" (UniqueName: \"kubernetes.io/projected/d10faaed-ffef-4afb-9f75-262e4fccd22a-kube-api-access-jf9fm\") pod \"auto-csr-approver-29563842-ckkmp\" (UID: \"d10faaed-ffef-4afb-9f75-262e4fccd22a\") " 
pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.487960 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.955984 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563842-ckkmp"] Mar 18 10:42:01 crc kubenswrapper[4778]: I0318 10:42:01.027038 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" event={"ID":"d10faaed-ffef-4afb-9f75-262e4fccd22a","Type":"ContainerStarted","Data":"ba2519ba60c141d970d8f26f1bfe017190303e07f1b25f56192925fa45efa340"} Mar 18 10:42:03 crc kubenswrapper[4778]: I0318 10:42:03.054556 4778 generic.go:334] "Generic (PLEG): container finished" podID="d10faaed-ffef-4afb-9f75-262e4fccd22a" containerID="cec3fa048e9699e703eb8a3404384f6f46bb6a98f37648f4a97cf2fe11dab009" exitCode=0 Mar 18 10:42:03 crc kubenswrapper[4778]: I0318 10:42:03.054925 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" event={"ID":"d10faaed-ffef-4afb-9f75-262e4fccd22a","Type":"ContainerDied","Data":"cec3fa048e9699e703eb8a3404384f6f46bb6a98f37648f4a97cf2fe11dab009"} Mar 18 10:42:04 crc kubenswrapper[4778]: I0318 10:42:04.595078 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:04 crc kubenswrapper[4778]: I0318 10:42:04.782043 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf9fm\" (UniqueName: \"kubernetes.io/projected/d10faaed-ffef-4afb-9f75-262e4fccd22a-kube-api-access-jf9fm\") pod \"d10faaed-ffef-4afb-9f75-262e4fccd22a\" (UID: \"d10faaed-ffef-4afb-9f75-262e4fccd22a\") " Mar 18 10:42:04 crc kubenswrapper[4778]: I0318 10:42:04.793597 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d10faaed-ffef-4afb-9f75-262e4fccd22a-kube-api-access-jf9fm" (OuterVolumeSpecName: "kube-api-access-jf9fm") pod "d10faaed-ffef-4afb-9f75-262e4fccd22a" (UID: "d10faaed-ffef-4afb-9f75-262e4fccd22a"). InnerVolumeSpecName "kube-api-access-jf9fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:42:04 crc kubenswrapper[4778]: I0318 10:42:04.884678 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf9fm\" (UniqueName: \"kubernetes.io/projected/d10faaed-ffef-4afb-9f75-262e4fccd22a-kube-api-access-jf9fm\") on node \"crc\" DevicePath \"\"" Mar 18 10:42:05 crc kubenswrapper[4778]: I0318 10:42:05.078744 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" event={"ID":"d10faaed-ffef-4afb-9f75-262e4fccd22a","Type":"ContainerDied","Data":"ba2519ba60c141d970d8f26f1bfe017190303e07f1b25f56192925fa45efa340"} Mar 18 10:42:05 crc kubenswrapper[4778]: I0318 10:42:05.078824 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba2519ba60c141d970d8f26f1bfe017190303e07f1b25f56192925fa45efa340" Mar 18 10:42:05 crc kubenswrapper[4778]: I0318 10:42:05.078933 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:05 crc kubenswrapper[4778]: I0318 10:42:05.192866 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:42:05 crc kubenswrapper[4778]: E0318 10:42:05.193345 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:42:05 crc kubenswrapper[4778]: I0318 10:42:05.685102 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563836-vj8r7"] Mar 18 10:42:05 crc kubenswrapper[4778]: I0318 10:42:05.699020 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563836-vj8r7"] Mar 18 10:42:06 crc kubenswrapper[4778]: I0318 10:42:06.202554 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2bea522-9825-4193-9fb8-6592bcc1e2c8" path="/var/lib/kubelet/pods/e2bea522-9825-4193-9fb8-6592bcc1e2c8/volumes" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.633068 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jdwm5"] Mar 18 10:42:13 crc kubenswrapper[4778]: E0318 10:42:13.634024 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10faaed-ffef-4afb-9f75-262e4fccd22a" containerName="oc" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.634038 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10faaed-ffef-4afb-9f75-262e4fccd22a" containerName="oc" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.634238 4778 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d10faaed-ffef-4afb-9f75-262e4fccd22a" containerName="oc" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.635590 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.659803 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-utilities\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.660143 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-catalog-content\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.660378 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxqvf\" (UniqueName: \"kubernetes.io/projected/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-kube-api-access-rxqvf\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.665802 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jdwm5"] Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.762042 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-utilities\") pod \"community-operators-jdwm5\" (UID: 
\"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.762173 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-catalog-content\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.762214 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxqvf\" (UniqueName: \"kubernetes.io/projected/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-kube-api-access-rxqvf\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.762575 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-utilities\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.762632 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-catalog-content\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.788098 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxqvf\" (UniqueName: \"kubernetes.io/projected/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-kube-api-access-rxqvf\") pod \"community-operators-jdwm5\" (UID: 
\"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.961535 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:14 crc kubenswrapper[4778]: I0318 10:42:14.514238 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jdwm5"] Mar 18 10:42:15 crc kubenswrapper[4778]: I0318 10:42:15.179191 4778 generic.go:334] "Generic (PLEG): container finished" podID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerID="e810276274c5d13e9f83a89d4970dccd708f26446183bed277d1f0891733f845" exitCode=0 Mar 18 10:42:15 crc kubenswrapper[4778]: I0318 10:42:15.179281 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerDied","Data":"e810276274c5d13e9f83a89d4970dccd708f26446183bed277d1f0891733f845"} Mar 18 10:42:15 crc kubenswrapper[4778]: I0318 10:42:15.179623 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerStarted","Data":"3b5256ad1745bed880d51fd8ff1844d200a8ecc298d03e6cb8e30102b2353e9f"} Mar 18 10:42:16 crc kubenswrapper[4778]: I0318 10:42:16.198600 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerStarted","Data":"3038181f3cf21b408bb5f7edd299880a51f254baf0e7004d42e611a268d7ae51"} Mar 18 10:42:18 crc kubenswrapper[4778]: I0318 10:42:18.188108 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:42:18 crc kubenswrapper[4778]: E0318 10:42:18.188964 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:42:18 crc kubenswrapper[4778]: I0318 10:42:18.211270 4778 generic.go:334] "Generic (PLEG): container finished" podID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerID="3038181f3cf21b408bb5f7edd299880a51f254baf0e7004d42e611a268d7ae51" exitCode=0 Mar 18 10:42:18 crc kubenswrapper[4778]: I0318 10:42:18.211326 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerDied","Data":"3038181f3cf21b408bb5f7edd299880a51f254baf0e7004d42e611a268d7ae51"} Mar 18 10:42:19 crc kubenswrapper[4778]: I0318 10:42:19.223696 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerStarted","Data":"d0f31064b207c07c3a3138a9916639084cca43b6ad3411af7cd72dd9b0e743df"} Mar 18 10:42:19 crc kubenswrapper[4778]: I0318 10:42:19.259190 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jdwm5" podStartSLOduration=2.795311554 podStartE2EDuration="6.259165868s" podCreationTimestamp="2026-03-18 10:42:13 +0000 UTC" firstStartedPulling="2026-03-18 10:42:15.181097728 +0000 UTC m=+6001.755842578" lastFinishedPulling="2026-03-18 10:42:18.644952052 +0000 UTC m=+6005.219696892" observedRunningTime="2026-03-18 10:42:19.250787291 +0000 UTC m=+6005.825532141" watchObservedRunningTime="2026-03-18 10:42:19.259165868 +0000 UTC m=+6005.833910718" Mar 18 10:42:23 crc kubenswrapper[4778]: I0318 10:42:23.962643 4778 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:23 crc kubenswrapper[4778]: I0318 10:42:23.963118 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:24 crc kubenswrapper[4778]: I0318 10:42:24.020247 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:24 crc kubenswrapper[4778]: I0318 10:42:24.332123 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:24 crc kubenswrapper[4778]: I0318 10:42:24.381028 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jdwm5"] Mar 18 10:42:26 crc kubenswrapper[4778]: I0318 10:42:26.301380 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jdwm5" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="registry-server" containerID="cri-o://d0f31064b207c07c3a3138a9916639084cca43b6ad3411af7cd72dd9b0e743df" gracePeriod=2 Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.312996 4778 generic.go:334] "Generic (PLEG): container finished" podID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerID="d0f31064b207c07c3a3138a9916639084cca43b6ad3411af7cd72dd9b0e743df" exitCode=0 Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.313347 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerDied","Data":"d0f31064b207c07c3a3138a9916639084cca43b6ad3411af7cd72dd9b0e743df"} Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.433671 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.538717 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-catalog-content\") pod \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.538880 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-utilities\") pod \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.538992 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxqvf\" (UniqueName: \"kubernetes.io/projected/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-kube-api-access-rxqvf\") pod \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.539823 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-utilities" (OuterVolumeSpecName: "utilities") pod "52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" (UID: "52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.550082 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-kube-api-access-rxqvf" (OuterVolumeSpecName: "kube-api-access-rxqvf") pod "52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" (UID: "52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1"). InnerVolumeSpecName "kube-api-access-rxqvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.596294 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" (UID: "52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.644035 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxqvf\" (UniqueName: \"kubernetes.io/projected/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-kube-api-access-rxqvf\") on node \"crc\" DevicePath \"\"" Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.644069 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.644078 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.325041 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerDied","Data":"3b5256ad1745bed880d51fd8ff1844d200a8ecc298d03e6cb8e30102b2353e9f"} Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.325120 4778 scope.go:117] "RemoveContainer" containerID="d0f31064b207c07c3a3138a9916639084cca43b6ad3411af7cd72dd9b0e743df" Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.327112 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.354614 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jdwm5"] Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.360578 4778 scope.go:117] "RemoveContainer" containerID="3038181f3cf21b408bb5f7edd299880a51f254baf0e7004d42e611a268d7ae51" Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.368363 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jdwm5"] Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.425383 4778 scope.go:117] "RemoveContainer" containerID="e810276274c5d13e9f83a89d4970dccd708f26446183bed277d1f0891733f845" Mar 18 10:42:30 crc kubenswrapper[4778]: I0318 10:42:30.202629 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" path="/var/lib/kubelet/pods/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1/volumes" Mar 18 10:42:32 crc kubenswrapper[4778]: I0318 10:42:32.187357 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:42:32 crc kubenswrapper[4778]: E0318 10:42:32.187793 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:42:36 crc kubenswrapper[4778]: I0318 10:42:36.548305 4778 scope.go:117] "RemoveContainer" containerID="927eb69f67f4add2a332f661db657c63610291a16f8bad5b3ab6b4321fca9e64" Mar 18 10:42:45 crc kubenswrapper[4778]: I0318 10:42:45.187094 4778 scope.go:117] "RemoveContainer" 
containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:42:45 crc kubenswrapper[4778]: E0318 10:42:45.187980 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:42:59 crc kubenswrapper[4778]: I0318 10:42:59.187465 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:42:59 crc kubenswrapper[4778]: E0318 10:42:59.188477 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:43:10 crc kubenswrapper[4778]: I0318 10:43:10.189796 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:43:10 crc kubenswrapper[4778]: E0318 10:43:10.190855 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:43:21 crc kubenswrapper[4778]: I0318 10:43:21.187427 4778 scope.go:117] 
"RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:43:21 crc kubenswrapper[4778]: E0318 10:43:21.188217 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:43:34 crc kubenswrapper[4778]: I0318 10:43:34.202980 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:43:34 crc kubenswrapper[4778]: E0318 10:43:34.203898 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:43:49 crc kubenswrapper[4778]: I0318 10:43:49.187414 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:43:49 crc kubenswrapper[4778]: E0318 10:43:49.188157 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.163964 
4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563844-j7kb9"] Mar 18 10:44:00 crc kubenswrapper[4778]: E0318 10:44:00.165055 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="extract-content" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.165072 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="extract-content" Mar 18 10:44:00 crc kubenswrapper[4778]: E0318 10:44:00.165093 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="registry-server" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.165101 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="registry-server" Mar 18 10:44:00 crc kubenswrapper[4778]: E0318 10:44:00.165129 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="extract-utilities" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.165137 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="extract-utilities" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.165427 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="registry-server" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.166245 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.169056 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.169314 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.169448 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.179074 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563844-j7kb9"] Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.236724 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z8sj\" (UniqueName: \"kubernetes.io/projected/2098dac3-962e-4d75-b22f-81aadc768dc6-kube-api-access-5z8sj\") pod \"auto-csr-approver-29563844-j7kb9\" (UID: \"2098dac3-962e-4d75-b22f-81aadc768dc6\") " pod="openshift-infra/auto-csr-approver-29563844-j7kb9" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.339183 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z8sj\" (UniqueName: \"kubernetes.io/projected/2098dac3-962e-4d75-b22f-81aadc768dc6-kube-api-access-5z8sj\") pod \"auto-csr-approver-29563844-j7kb9\" (UID: \"2098dac3-962e-4d75-b22f-81aadc768dc6\") " pod="openshift-infra/auto-csr-approver-29563844-j7kb9" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.361436 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z8sj\" (UniqueName: \"kubernetes.io/projected/2098dac3-962e-4d75-b22f-81aadc768dc6-kube-api-access-5z8sj\") pod \"auto-csr-approver-29563844-j7kb9\" (UID: \"2098dac3-962e-4d75-b22f-81aadc768dc6\") " 
pod="openshift-infra/auto-csr-approver-29563844-j7kb9" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.518437 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.987957 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563844-j7kb9"] Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.990294 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:44:01 crc kubenswrapper[4778]: I0318 10:44:01.219770 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" event={"ID":"2098dac3-962e-4d75-b22f-81aadc768dc6","Type":"ContainerStarted","Data":"ce2c426af705132d69bfde3949558052b8bdd7965ad71de8653d098877649038"} Mar 18 10:44:02 crc kubenswrapper[4778]: I0318 10:44:02.235837 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" event={"ID":"2098dac3-962e-4d75-b22f-81aadc768dc6","Type":"ContainerStarted","Data":"88b38e0fbddd0bbafde023ebaf3f6bdcd76dd7a995f8e2d8d9c48c114d683213"} Mar 18 10:44:02 crc kubenswrapper[4778]: I0318 10:44:02.269121 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" podStartSLOduration=1.394289291 podStartE2EDuration="2.269099394s" podCreationTimestamp="2026-03-18 10:44:00 +0000 UTC" firstStartedPulling="2026-03-18 10:44:00.989946416 +0000 UTC m=+6107.564691266" lastFinishedPulling="2026-03-18 10:44:01.864756529 +0000 UTC m=+6108.439501369" observedRunningTime="2026-03-18 10:44:02.260695586 +0000 UTC m=+6108.835440466" watchObservedRunningTime="2026-03-18 10:44:02.269099394 +0000 UTC m=+6108.843844234" Mar 18 10:44:03 crc kubenswrapper[4778]: I0318 10:44:03.187074 4778 scope.go:117] "RemoveContainer" 
containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:44:03 crc kubenswrapper[4778]: I0318 10:44:03.325635 4778 generic.go:334] "Generic (PLEG): container finished" podID="2098dac3-962e-4d75-b22f-81aadc768dc6" containerID="88b38e0fbddd0bbafde023ebaf3f6bdcd76dd7a995f8e2d8d9c48c114d683213" exitCode=0 Mar 18 10:44:03 crc kubenswrapper[4778]: I0318 10:44:03.325692 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" event={"ID":"2098dac3-962e-4d75-b22f-81aadc768dc6","Type":"ContainerDied","Data":"88b38e0fbddd0bbafde023ebaf3f6bdcd76dd7a995f8e2d8d9c48c114d683213"} Mar 18 10:44:04 crc kubenswrapper[4778]: I0318 10:44:04.337132 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"90db09640ef7e7d8376edb316416510275b2a613d72a3ab935ed1da863b06852"} Mar 18 10:44:04 crc kubenswrapper[4778]: I0318 10:44:04.864095 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" Mar 18 10:44:04 crc kubenswrapper[4778]: I0318 10:44:04.948883 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z8sj\" (UniqueName: \"kubernetes.io/projected/2098dac3-962e-4d75-b22f-81aadc768dc6-kube-api-access-5z8sj\") pod \"2098dac3-962e-4d75-b22f-81aadc768dc6\" (UID: \"2098dac3-962e-4d75-b22f-81aadc768dc6\") " Mar 18 10:44:04 crc kubenswrapper[4778]: I0318 10:44:04.955663 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2098dac3-962e-4d75-b22f-81aadc768dc6-kube-api-access-5z8sj" (OuterVolumeSpecName: "kube-api-access-5z8sj") pod "2098dac3-962e-4d75-b22f-81aadc768dc6" (UID: "2098dac3-962e-4d75-b22f-81aadc768dc6"). InnerVolumeSpecName "kube-api-access-5z8sj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:44:05 crc kubenswrapper[4778]: I0318 10:44:05.051598 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z8sj\" (UniqueName: \"kubernetes.io/projected/2098dac3-962e-4d75-b22f-81aadc768dc6-kube-api-access-5z8sj\") on node \"crc\" DevicePath \"\"" Mar 18 10:44:05 crc kubenswrapper[4778]: I0318 10:44:05.333917 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563838-xhh59"] Mar 18 10:44:05 crc kubenswrapper[4778]: I0318 10:44:05.342546 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563838-xhh59"] Mar 18 10:44:05 crc kubenswrapper[4778]: I0318 10:44:05.347067 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" event={"ID":"2098dac3-962e-4d75-b22f-81aadc768dc6","Type":"ContainerDied","Data":"ce2c426af705132d69bfde3949558052b8bdd7965ad71de8653d098877649038"} Mar 18 10:44:05 crc kubenswrapper[4778]: I0318 10:44:05.347113 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce2c426af705132d69bfde3949558052b8bdd7965ad71de8653d098877649038" Mar 18 10:44:05 crc kubenswrapper[4778]: I0318 10:44:05.347134 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" Mar 18 10:44:06 crc kubenswrapper[4778]: I0318 10:44:06.196945 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06d0e7b4-0fff-4364-bef0-a408acdbcdbb" path="/var/lib/kubelet/pods/06d0e7b4-0fff-4364-bef0-a408acdbcdbb/volumes" Mar 18 10:44:36 crc kubenswrapper[4778]: I0318 10:44:36.679184 4778 scope.go:117] "RemoveContainer" containerID="646650d3ca5a64cb0621f10e2cff5ed5ebd691e96f89d98ddbd2ca48d396fddb" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.171068 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"] Mar 18 10:45:00 crc kubenswrapper[4778]: E0318 10:45:00.172215 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2098dac3-962e-4d75-b22f-81aadc768dc6" containerName="oc" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.172232 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2098dac3-962e-4d75-b22f-81aadc768dc6" containerName="oc" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.172480 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2098dac3-962e-4d75-b22f-81aadc768dc6" containerName="oc" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.173252 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.177157 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.177460 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.205308 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"] Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.357084 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-secret-volume\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.357334 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbf52\" (UniqueName: \"kubernetes.io/projected/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-kube-api-access-xbf52\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.357653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-config-volume\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.461068 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbf52\" (UniqueName: \"kubernetes.io/projected/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-kube-api-access-xbf52\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.461169 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-config-volume\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.461304 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-secret-volume\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.463039 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-config-volume\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.476410 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-secret-volume\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.490816 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbf52\" (UniqueName: \"kubernetes.io/projected/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-kube-api-access-xbf52\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.503539 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.966876 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"] Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.990532 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" event={"ID":"b11278a5-f162-4ea2-abf3-dd1176b7ef1f","Type":"ContainerStarted","Data":"d67285ff0e86fa649bded598eaf80fa023059cdc1cd55d6ed0139326637ce920"} Mar 18 10:45:02 crc kubenswrapper[4778]: I0318 10:45:02.000395 4778 generic.go:334] "Generic (PLEG): container finished" podID="b11278a5-f162-4ea2-abf3-dd1176b7ef1f" containerID="52074f34225e99f56da69aa3b63f9abb2e62cdce5a1ef7447dd39a3f7432ef79" exitCode=0 Mar 18 10:45:02 crc kubenswrapper[4778]: I0318 10:45:02.000519 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" 
event={"ID":"b11278a5-f162-4ea2-abf3-dd1176b7ef1f","Type":"ContainerDied","Data":"52074f34225e99f56da69aa3b63f9abb2e62cdce5a1ef7447dd39a3f7432ef79"} Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.474077 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.629571 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-secret-volume\") pod \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.629946 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-config-volume\") pod \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.630038 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbf52\" (UniqueName: \"kubernetes.io/projected/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-kube-api-access-xbf52\") pod \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.630566 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-config-volume" (OuterVolumeSpecName: "config-volume") pod "b11278a5-f162-4ea2-abf3-dd1176b7ef1f" (UID: "b11278a5-f162-4ea2-abf3-dd1176b7ef1f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.630899 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.635903 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b11278a5-f162-4ea2-abf3-dd1176b7ef1f" (UID: "b11278a5-f162-4ea2-abf3-dd1176b7ef1f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.636380 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-kube-api-access-xbf52" (OuterVolumeSpecName: "kube-api-access-xbf52") pod "b11278a5-f162-4ea2-abf3-dd1176b7ef1f" (UID: "b11278a5-f162-4ea2-abf3-dd1176b7ef1f"). InnerVolumeSpecName "kube-api-access-xbf52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.731873 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbf52\" (UniqueName: \"kubernetes.io/projected/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-kube-api-access-xbf52\") on node \"crc\" DevicePath \"\"" Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.731910 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:45:04 crc kubenswrapper[4778]: I0318 10:45:04.026236 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" event={"ID":"b11278a5-f162-4ea2-abf3-dd1176b7ef1f","Type":"ContainerDied","Data":"d67285ff0e86fa649bded598eaf80fa023059cdc1cd55d6ed0139326637ce920"} Mar 18 10:45:04 crc kubenswrapper[4778]: I0318 10:45:04.026301 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d67285ff0e86fa649bded598eaf80fa023059cdc1cd55d6ed0139326637ce920" Mar 18 10:45:04 crc kubenswrapper[4778]: I0318 10:45:04.026391 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" Mar 18 10:45:04 crc kubenswrapper[4778]: I0318 10:45:04.565695 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"] Mar 18 10:45:04 crc kubenswrapper[4778]: I0318 10:45:04.576906 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"] Mar 18 10:45:06 crc kubenswrapper[4778]: I0318 10:45:06.207160 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca9f1133-0fec-4eeb-8b9b-39148a035a92" path="/var/lib/kubelet/pods/ca9f1133-0fec-4eeb-8b9b-39148a035a92/volumes" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.623381 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v6wj7"] Mar 18 10:45:33 crc kubenswrapper[4778]: E0318 10:45:33.624782 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11278a5-f162-4ea2-abf3-dd1176b7ef1f" containerName="collect-profiles" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.624809 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11278a5-f162-4ea2-abf3-dd1176b7ef1f" containerName="collect-profiles" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.625146 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b11278a5-f162-4ea2-abf3-dd1176b7ef1f" containerName="collect-profiles" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.628055 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.635726 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6wj7"] Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.775253 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x4jn\" (UniqueName: \"kubernetes.io/projected/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-kube-api-access-4x4jn\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.775468 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-utilities\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.775499 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-catalog-content\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.877533 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-utilities\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.877595 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-catalog-content\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.877657 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x4jn\" (UniqueName: \"kubernetes.io/projected/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-kube-api-access-4x4jn\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.878273 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-utilities\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.878326 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-catalog-content\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.898749 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x4jn\" (UniqueName: \"kubernetes.io/projected/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-kube-api-access-4x4jn\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.961793 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:34 crc kubenswrapper[4778]: W0318 10:45:34.435703 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64c0c4b8_b1b6_4912_88c7_61be4fe4b899.slice/crio-f35f951c4605b5ffd52c3be40975f85b6fb2b61cc01c221181a9065aa7ef06fc WatchSource:0}: Error finding container f35f951c4605b5ffd52c3be40975f85b6fb2b61cc01c221181a9065aa7ef06fc: Status 404 returned error can't find the container with id f35f951c4605b5ffd52c3be40975f85b6fb2b61cc01c221181a9065aa7ef06fc Mar 18 10:45:34 crc kubenswrapper[4778]: I0318 10:45:34.439601 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6wj7"] Mar 18 10:45:35 crc kubenswrapper[4778]: I0318 10:45:35.366890 4778 generic.go:334] "Generic (PLEG): container finished" podID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerID="2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44" exitCode=0 Mar 18 10:45:35 crc kubenswrapper[4778]: I0318 10:45:35.366962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerDied","Data":"2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44"} Mar 18 10:45:35 crc kubenswrapper[4778]: I0318 10:45:35.367539 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerStarted","Data":"f35f951c4605b5ffd52c3be40975f85b6fb2b61cc01c221181a9065aa7ef06fc"} Mar 18 10:45:36 crc kubenswrapper[4778]: I0318 10:45:36.378985 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" 
event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerStarted","Data":"be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3"} Mar 18 10:45:36 crc kubenswrapper[4778]: I0318 10:45:36.751957 4778 scope.go:117] "RemoveContainer" containerID="20eba30be4d8526eb64b11cc9e3c58803630e3554035c19c9650d8cecb2ebf82" Mar 18 10:45:37 crc kubenswrapper[4778]: I0318 10:45:37.386543 4778 generic.go:334] "Generic (PLEG): container finished" podID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerID="be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3" exitCode=0 Mar 18 10:45:37 crc kubenswrapper[4778]: I0318 10:45:37.386570 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerDied","Data":"be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3"} Mar 18 10:45:38 crc kubenswrapper[4778]: I0318 10:45:38.401667 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerStarted","Data":"7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810"} Mar 18 10:45:38 crc kubenswrapper[4778]: I0318 10:45:38.427170 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v6wj7" podStartSLOduration=2.991236952 podStartE2EDuration="5.427156829s" podCreationTimestamp="2026-03-18 10:45:33 +0000 UTC" firstStartedPulling="2026-03-18 10:45:35.369626895 +0000 UTC m=+6201.944371735" lastFinishedPulling="2026-03-18 10:45:37.805546762 +0000 UTC m=+6204.380291612" observedRunningTime="2026-03-18 10:45:38.418625128 +0000 UTC m=+6204.993369978" watchObservedRunningTime="2026-03-18 10:45:38.427156829 +0000 UTC m=+6205.001901659" Mar 18 10:45:43 crc kubenswrapper[4778]: I0318 10:45:43.962927 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:43 crc kubenswrapper[4778]: I0318 10:45:43.963599 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:44 crc kubenswrapper[4778]: I0318 10:45:44.008642 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:44 crc kubenswrapper[4778]: I0318 10:45:44.514650 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:44 crc kubenswrapper[4778]: I0318 10:45:44.578380 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6wj7"] Mar 18 10:45:46 crc kubenswrapper[4778]: I0318 10:45:46.485152 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v6wj7" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="registry-server" containerID="cri-o://7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810" gracePeriod=2 Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.434313 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.495470 4778 generic.go:334] "Generic (PLEG): container finished" podID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerID="7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810" exitCode=0 Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.495500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerDied","Data":"7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810"} Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.495533 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6wj7" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.495551 4778 scope.go:117] "RemoveContainer" containerID="7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.495540 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerDied","Data":"f35f951c4605b5ffd52c3be40975f85b6fb2b61cc01c221181a9065aa7ef06fc"} Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.513977 4778 scope.go:117] "RemoveContainer" containerID="be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.534345 4778 scope.go:117] "RemoveContainer" containerID="2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.590059 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-utilities\") pod 
\"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.590225 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x4jn\" (UniqueName: \"kubernetes.io/projected/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-kube-api-access-4x4jn\") pod \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.590298 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-catalog-content\") pod \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.591442 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-utilities" (OuterVolumeSpecName: "utilities") pod "64c0c4b8-b1b6-4912-88c7-61be4fe4b899" (UID: "64c0c4b8-b1b6-4912-88c7-61be4fe4b899"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.594490 4778 scope.go:117] "RemoveContainer" containerID="7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810" Mar 18 10:45:47 crc kubenswrapper[4778]: E0318 10:45:47.595049 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810\": container with ID starting with 7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810 not found: ID does not exist" containerID="7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.595098 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810"} err="failed to get container status \"7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810\": rpc error: code = NotFound desc = could not find container \"7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810\": container with ID starting with 7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810 not found: ID does not exist" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.595127 4778 scope.go:117] "RemoveContainer" containerID="be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3" Mar 18 10:45:47 crc kubenswrapper[4778]: E0318 10:45:47.595503 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3\": container with ID starting with be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3 not found: ID does not exist" containerID="be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.595546 
4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3"} err="failed to get container status \"be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3\": rpc error: code = NotFound desc = could not find container \"be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3\": container with ID starting with be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3 not found: ID does not exist" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.595561 4778 scope.go:117] "RemoveContainer" containerID="2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44" Mar 18 10:45:47 crc kubenswrapper[4778]: E0318 10:45:47.596260 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44\": container with ID starting with 2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44 not found: ID does not exist" containerID="2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.596307 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44"} err="failed to get container status \"2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44\": rpc error: code = NotFound desc = could not find container \"2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44\": container with ID starting with 2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44 not found: ID does not exist" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.598057 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-kube-api-access-4x4jn" 
(OuterVolumeSpecName: "kube-api-access-4x4jn") pod "64c0c4b8-b1b6-4912-88c7-61be4fe4b899" (UID: "64c0c4b8-b1b6-4912-88c7-61be4fe4b899"). InnerVolumeSpecName "kube-api-access-4x4jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.615562 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64c0c4b8-b1b6-4912-88c7-61be4fe4b899" (UID: "64c0c4b8-b1b6-4912-88c7-61be4fe4b899"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.692609 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x4jn\" (UniqueName: \"kubernetes.io/projected/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-kube-api-access-4x4jn\") on node \"crc\" DevicePath \"\"" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.692647 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.692661 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.848954 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6wj7"] Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.860675 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6wj7"] Mar 18 10:45:48 crc kubenswrapper[4778]: I0318 10:45:48.211391 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" path="/var/lib/kubelet/pods/64c0c4b8-b1b6-4912-88c7-61be4fe4b899/volumes" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.182152 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563846-jwrl9"] Mar 18 10:46:00 crc kubenswrapper[4778]: E0318 10:46:00.183114 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="extract-content" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.183131 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="extract-content" Mar 18 10:46:00 crc kubenswrapper[4778]: E0318 10:46:00.183156 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="extract-utilities" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.183163 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="extract-utilities" Mar 18 10:46:00 crc kubenswrapper[4778]: E0318 10:46:00.183190 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="registry-server" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.183256 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="registry-server" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.183516 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="registry-server" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.184318 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.187360 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.187418 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.187924 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.200756 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563846-jwrl9"] Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.359416 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slpdj\" (UniqueName: \"kubernetes.io/projected/2ed10f0a-3d2d-483e-9532-dd1f7b38631b-kube-api-access-slpdj\") pod \"auto-csr-approver-29563846-jwrl9\" (UID: \"2ed10f0a-3d2d-483e-9532-dd1f7b38631b\") " pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.461889 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slpdj\" (UniqueName: \"kubernetes.io/projected/2ed10f0a-3d2d-483e-9532-dd1f7b38631b-kube-api-access-slpdj\") pod \"auto-csr-approver-29563846-jwrl9\" (UID: \"2ed10f0a-3d2d-483e-9532-dd1f7b38631b\") " pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.486871 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slpdj\" (UniqueName: \"kubernetes.io/projected/2ed10f0a-3d2d-483e-9532-dd1f7b38631b-kube-api-access-slpdj\") pod \"auto-csr-approver-29563846-jwrl9\" (UID: \"2ed10f0a-3d2d-483e-9532-dd1f7b38631b\") " 
pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.504944 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.969949 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563846-jwrl9"] Mar 18 10:46:01 crc kubenswrapper[4778]: I0318 10:46:01.626112 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" event={"ID":"2ed10f0a-3d2d-483e-9532-dd1f7b38631b","Type":"ContainerStarted","Data":"2eb014649fd8d480769737cbee9ea9c54962845af511b22a4fdca375f6c6cb50"} Mar 18 10:46:02 crc kubenswrapper[4778]: I0318 10:46:02.636940 4778 generic.go:334] "Generic (PLEG): container finished" podID="2ed10f0a-3d2d-483e-9532-dd1f7b38631b" containerID="a51630f7ff38c957b6d8be33f92679338164d3fd19d236304cf23699728f1e4b" exitCode=0 Mar 18 10:46:02 crc kubenswrapper[4778]: I0318 10:46:02.637012 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" event={"ID":"2ed10f0a-3d2d-483e-9532-dd1f7b38631b","Type":"ContainerDied","Data":"a51630f7ff38c957b6d8be33f92679338164d3fd19d236304cf23699728f1e4b"} Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.028107 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.139102 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slpdj\" (UniqueName: \"kubernetes.io/projected/2ed10f0a-3d2d-483e-9532-dd1f7b38631b-kube-api-access-slpdj\") pod \"2ed10f0a-3d2d-483e-9532-dd1f7b38631b\" (UID: \"2ed10f0a-3d2d-483e-9532-dd1f7b38631b\") " Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.161532 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed10f0a-3d2d-483e-9532-dd1f7b38631b-kube-api-access-slpdj" (OuterVolumeSpecName: "kube-api-access-slpdj") pod "2ed10f0a-3d2d-483e-9532-dd1f7b38631b" (UID: "2ed10f0a-3d2d-483e-9532-dd1f7b38631b"). InnerVolumeSpecName "kube-api-access-slpdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.242220 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slpdj\" (UniqueName: \"kubernetes.io/projected/2ed10f0a-3d2d-483e-9532-dd1f7b38631b-kube-api-access-slpdj\") on node \"crc\" DevicePath \"\"" Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.661742 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" event={"ID":"2ed10f0a-3d2d-483e-9532-dd1f7b38631b","Type":"ContainerDied","Data":"2eb014649fd8d480769737cbee9ea9c54962845af511b22a4fdca375f6c6cb50"} Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.661782 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eb014649fd8d480769737cbee9ea9c54962845af511b22a4fdca375f6c6cb50" Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.662289 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:05 crc kubenswrapper[4778]: I0318 10:46:05.108626 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563840-fv5mm"] Mar 18 10:46:05 crc kubenswrapper[4778]: I0318 10:46:05.119449 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563840-fv5mm"] Mar 18 10:46:06 crc kubenswrapper[4778]: I0318 10:46:06.202389 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc3d0dd7-380a-4f1c-bb78-f8df1a73362e" path="/var/lib/kubelet/pods/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e/volumes" Mar 18 10:46:30 crc kubenswrapper[4778]: I0318 10:46:30.147560 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:46:30 crc kubenswrapper[4778]: I0318 10:46:30.148225 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:46:36 crc kubenswrapper[4778]: I0318 10:46:36.807735 4778 scope.go:117] "RemoveContainer" containerID="105700f78835bb2b225e76573d66e982ecb74a475715ed5f6fa69ca1e19eafce" Mar 18 10:47:00 crc kubenswrapper[4778]: I0318 10:47:00.147116 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:47:00 crc kubenswrapper[4778]: 
I0318 10:47:00.147865 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.147603 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.148509 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.148591 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.149935 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90db09640ef7e7d8376edb316416510275b2a613d72a3ab935ed1da863b06852"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.150054 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" 
containerName="machine-config-daemon" containerID="cri-o://90db09640ef7e7d8376edb316416510275b2a613d72a3ab935ed1da863b06852" gracePeriod=600 Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.633867 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="90db09640ef7e7d8376edb316416510275b2a613d72a3ab935ed1da863b06852" exitCode=0 Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.634239 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"90db09640ef7e7d8376edb316416510275b2a613d72a3ab935ed1da863b06852"} Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.634279 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"} Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.634299 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.055762 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9z6mr"] Mar 18 10:47:33 crc kubenswrapper[4778]: E0318 10:47:33.056804 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed10f0a-3d2d-483e-9532-dd1f7b38631b" containerName="oc" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.056822 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed10f0a-3d2d-483e-9532-dd1f7b38631b" containerName="oc" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.057081 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed10f0a-3d2d-483e-9532-dd1f7b38631b" containerName="oc" Mar 18 10:47:33 crc 
kubenswrapper[4778]: I0318 10:47:33.059161 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.084104 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9z6mr"] Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.225140 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvlt8\" (UniqueName: \"kubernetes.io/projected/6bd497db-2065-4c53-8a1e-1499f18fb717-kube-api-access-rvlt8\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.225625 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-utilities\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.225768 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-catalog-content\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.327604 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-utilities\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc 
kubenswrapper[4778]: I0318 10:47:33.327707 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-catalog-content\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.327736 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvlt8\" (UniqueName: \"kubernetes.io/projected/6bd497db-2065-4c53-8a1e-1499f18fb717-kube-api-access-rvlt8\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.328381 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-catalog-content\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.328540 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-utilities\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.352049 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvlt8\" (UniqueName: \"kubernetes.io/projected/6bd497db-2065-4c53-8a1e-1499f18fb717-kube-api-access-rvlt8\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 
10:47:33.393350 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.953908 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9z6mr"] Mar 18 10:47:34 crc kubenswrapper[4778]: I0318 10:47:34.680937 4778 generic.go:334] "Generic (PLEG): container finished" podID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerID="87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5" exitCode=0 Mar 18 10:47:34 crc kubenswrapper[4778]: I0318 10:47:34.681280 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerDied","Data":"87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5"} Mar 18 10:47:34 crc kubenswrapper[4778]: I0318 10:47:34.681308 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerStarted","Data":"bd6bd7135302f1dcdf1cf4e0ba01837911d65c1b90f886c2591d77d03440840e"} Mar 18 10:47:35 crc kubenswrapper[4778]: I0318 10:47:35.691041 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerStarted","Data":"2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064"} Mar 18 10:47:35 crc kubenswrapper[4778]: I0318 10:47:35.694116 4778 generic.go:334] "Generic (PLEG): container finished" podID="757e3758-d646-4267-8c4c-b5efb0dcf709" containerID="c559ae3a1e4423e99c37d72f15f18f3cd16bc2838d62270df411dbac2afa6c1e" exitCode=0 Mar 18 10:47:35 crc kubenswrapper[4778]: I0318 10:47:35.694168 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" 
event={"ID":"757e3758-d646-4267-8c4c-b5efb0dcf709","Type":"ContainerDied","Data":"c559ae3a1e4423e99c37d72f15f18f3cd16bc2838d62270df411dbac2afa6c1e"} Mar 18 10:47:36 crc kubenswrapper[4778]: I0318 10:47:36.706465 4778 generic.go:334] "Generic (PLEG): container finished" podID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerID="2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064" exitCode=0 Mar 18 10:47:36 crc kubenswrapper[4778]: I0318 10:47:36.706570 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerDied","Data":"2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064"} Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.542442 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.632144 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"] Mar 18 10:47:37 crc kubenswrapper[4778]: E0318 10:47:37.646415 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757e3758-d646-4267-8c4c-b5efb0dcf709" containerName="tempest-tests-tempest-tests-runner" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.646536 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="757e3758-d646-4267-8c4c-b5efb0dcf709" containerName="tempest-tests-tempest-tests-runner" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.649187 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="757e3758-d646-4267-8c4c-b5efb0dcf709" containerName="tempest-tests-tempest-tests-runner" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.654030 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.670341 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s1" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.670680 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s1" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.671223 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"] Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.716419 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"757e3758-d646-4267-8c4c-b5efb0dcf709","Type":"ContainerDied","Data":"dcac98cd78d62b2f03dd429a022d38c29d36c13fc170b830ecbd627ba6023d27"} Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.716474 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcac98cd78d62b2f03dd429a022d38c29d36c13fc170b830ecbd627ba6023d27" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.716731 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717290 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-config-data\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717389 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717505 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ssh-key\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717597 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config-secret\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717615 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ca-certs\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717664 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-temporary\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717703 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c89k6\" (UniqueName: \"kubernetes.io/projected/757e3758-d646-4267-8c4c-b5efb0dcf709-kube-api-access-c89k6\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717719 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717740 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-workdir\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717756 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ceph\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.720856 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-config-data" (OuterVolumeSpecName: "config-data") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: 
"757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.721076 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerStarted","Data":"872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142"} Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.721636 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.728538 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.731688 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.738433 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757e3758-d646-4267-8c4c-b5efb0dcf709-kube-api-access-c89k6" (OuterVolumeSpecName: "kube-api-access-c89k6") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "kube-api-access-c89k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.739237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ceph" (OuterVolumeSpecName: "ceph") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.750143 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.757119 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.772041 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.774914 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.819491 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx57k\" (UniqueName: \"kubernetes.io/projected/c5a7a532-f8c2-4741-9892-65047a4cb225-kube-api-access-sx57k\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.819535 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.819570 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.819756 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.819866 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.819906 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820017 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 
10:47:37.820069 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820104 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820142 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820404 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c89k6\" (UniqueName: \"kubernetes.io/projected/757e3758-d646-4267-8c4c-b5efb0dcf709-kube-api-access-c89k6\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820482 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820505 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ceph\") on node 
\"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820521 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820537 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820547 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820556 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820565 4778 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820575 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.848656 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " 
pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922006 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922050 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922132 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922168 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" 
(UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922261 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx57k\" (UniqueName: \"kubernetes.io/projected/c5a7a532-f8c2-4741-9892-65047a4cb225-kube-api-access-sx57k\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922277 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922303 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922335 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922968 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.923300 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.923860 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.924082 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.926344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.926748 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.927215 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.929077 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.938010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx57k\" (UniqueName: \"kubernetes.io/projected/c5a7a532-f8c2-4741-9892-65047a4cb225-kube-api-access-sx57k\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.991162 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:38 crc kubenswrapper[4778]: I0318 10:47:38.544023 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9z6mr" podStartSLOduration=2.943335798 podStartE2EDuration="5.544001582s" podCreationTimestamp="2026-03-18 10:47:33 +0000 UTC" firstStartedPulling="2026-03-18 10:47:34.683316296 +0000 UTC m=+6321.258061136" lastFinishedPulling="2026-03-18 10:47:37.28398207 +0000 UTC m=+6323.858726920" observedRunningTime="2026-03-18 10:47:37.761790391 +0000 UTC m=+6324.336535241" watchObservedRunningTime="2026-03-18 10:47:38.544001582 +0000 UTC m=+6325.118746422" Mar 18 10:47:38 crc kubenswrapper[4778]: I0318 10:47:38.547224 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"] Mar 18 10:47:38 crc kubenswrapper[4778]: I0318 10:47:38.730843 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"c5a7a532-f8c2-4741-9892-65047a4cb225","Type":"ContainerStarted","Data":"1ee8cf024ce4398d1f0ed48d786bbb6b3add9e2f95a7fd4bf27b0fad0caf4251"} Mar 18 10:47:39 crc kubenswrapper[4778]: I0318 10:47:39.741232 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"c5a7a532-f8c2-4741-9892-65047a4cb225","Type":"ContainerStarted","Data":"44ebfbf33b960c39e1ffc52c3185dc0dc1ec7c33f6f6b2ba0c1b6ca80065a482"} Mar 18 10:47:39 crc kubenswrapper[4778]: I0318 10:47:39.768090 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s01-single-test" podStartSLOduration=2.768072983 podStartE2EDuration="2.768072983s" podCreationTimestamp="2026-03-18 10:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:47:39.764717282 
+0000 UTC m=+6326.339462142" watchObservedRunningTime="2026-03-18 10:47:39.768072983 +0000 UTC m=+6326.342817823" Mar 18 10:47:43 crc kubenswrapper[4778]: I0318 10:47:43.393754 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:43 crc kubenswrapper[4778]: I0318 10:47:43.395962 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:43 crc kubenswrapper[4778]: I0318 10:47:43.454483 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:43 crc kubenswrapper[4778]: I0318 10:47:43.834777 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:43 crc kubenswrapper[4778]: I0318 10:47:43.890120 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9z6mr"] Mar 18 10:47:45 crc kubenswrapper[4778]: I0318 10:47:45.807937 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9z6mr" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="registry-server" containerID="cri-o://872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142" gracePeriod=2 Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.783763 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.825751 4778 generic.go:334] "Generic (PLEG): container finished" podID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerID="872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142" exitCode=0 Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.826174 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerDied","Data":"872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142"} Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.826238 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerDied","Data":"bd6bd7135302f1dcdf1cf4e0ba01837911d65c1b90f886c2591d77d03440840e"} Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.826264 4778 scope.go:117] "RemoveContainer" containerID="872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.826330 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.867693 4778 scope.go:117] "RemoveContainer" containerID="2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.911677 4778 scope.go:117] "RemoveContainer" containerID="87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.915859 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-utilities\") pod \"6bd497db-2065-4c53-8a1e-1499f18fb717\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.915963 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-catalog-content\") pod \"6bd497db-2065-4c53-8a1e-1499f18fb717\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.916159 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvlt8\" (UniqueName: \"kubernetes.io/projected/6bd497db-2065-4c53-8a1e-1499f18fb717-kube-api-access-rvlt8\") pod \"6bd497db-2065-4c53-8a1e-1499f18fb717\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.918083 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-utilities" (OuterVolumeSpecName: "utilities") pod "6bd497db-2065-4c53-8a1e-1499f18fb717" (UID: "6bd497db-2065-4c53-8a1e-1499f18fb717"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.922130 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd497db-2065-4c53-8a1e-1499f18fb717-kube-api-access-rvlt8" (OuterVolumeSpecName: "kube-api-access-rvlt8") pod "6bd497db-2065-4c53-8a1e-1499f18fb717" (UID: "6bd497db-2065-4c53-8a1e-1499f18fb717"). InnerVolumeSpecName "kube-api-access-rvlt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.974573 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bd497db-2065-4c53-8a1e-1499f18fb717" (UID: "6bd497db-2065-4c53-8a1e-1499f18fb717"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.005543 4778 scope.go:117] "RemoveContainer" containerID="872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142" Mar 18 10:47:47 crc kubenswrapper[4778]: E0318 10:47:47.005886 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142\": container with ID starting with 872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142 not found: ID does not exist" containerID="872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.005940 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142"} err="failed to get container status \"872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142\": rpc error: code = NotFound desc = could not find 
container \"872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142\": container with ID starting with 872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142 not found: ID does not exist" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.005964 4778 scope.go:117] "RemoveContainer" containerID="2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064" Mar 18 10:47:47 crc kubenswrapper[4778]: E0318 10:47:47.006414 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064\": container with ID starting with 2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064 not found: ID does not exist" containerID="2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.006472 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064"} err="failed to get container status \"2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064\": rpc error: code = NotFound desc = could not find container \"2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064\": container with ID starting with 2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064 not found: ID does not exist" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.006504 4778 scope.go:117] "RemoveContainer" containerID="87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5" Mar 18 10:47:47 crc kubenswrapper[4778]: E0318 10:47:47.006792 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5\": container with ID starting with 87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5 not found: ID does 
not exist" containerID="87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.006814 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5"} err="failed to get container status \"87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5\": rpc error: code = NotFound desc = could not find container \"87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5\": container with ID starting with 87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5 not found: ID does not exist" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.019962 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.019987 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.019997 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvlt8\" (UniqueName: \"kubernetes.io/projected/6bd497db-2065-4c53-8a1e-1499f18fb717-kube-api-access-rvlt8\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.161629 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9z6mr"] Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.171056 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9z6mr"] Mar 18 10:47:48 crc kubenswrapper[4778]: I0318 10:47:48.206689 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" path="/var/lib/kubelet/pods/6bd497db-2065-4c53-8a1e-1499f18fb717/volumes" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.148569 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563848-f28cg"] Mar 18 10:48:00 crc kubenswrapper[4778]: E0318 10:48:00.149290 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="registry-server" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.149302 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="registry-server" Mar 18 10:48:00 crc kubenswrapper[4778]: E0318 10:48:00.149328 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="extract-content" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.149335 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="extract-content" Mar 18 10:48:00 crc kubenswrapper[4778]: E0318 10:48:00.149355 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="extract-utilities" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.149361 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="extract-utilities" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.149529 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="registry-server" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.150102 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563848-f28cg" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.170039 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.170560 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.170992 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.174503 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563848-f28cg"] Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.294073 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g7w4\" (UniqueName: \"kubernetes.io/projected/49327474-2bad-4ebc-b955-bf9dd1268c5e-kube-api-access-2g7w4\") pod \"auto-csr-approver-29563848-f28cg\" (UID: \"49327474-2bad-4ebc-b955-bf9dd1268c5e\") " pod="openshift-infra/auto-csr-approver-29563848-f28cg" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.397267 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g7w4\" (UniqueName: \"kubernetes.io/projected/49327474-2bad-4ebc-b955-bf9dd1268c5e-kube-api-access-2g7w4\") pod \"auto-csr-approver-29563848-f28cg\" (UID: \"49327474-2bad-4ebc-b955-bf9dd1268c5e\") " pod="openshift-infra/auto-csr-approver-29563848-f28cg" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.432242 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g7w4\" (UniqueName: \"kubernetes.io/projected/49327474-2bad-4ebc-b955-bf9dd1268c5e-kube-api-access-2g7w4\") pod \"auto-csr-approver-29563848-f28cg\" (UID: \"49327474-2bad-4ebc-b955-bf9dd1268c5e\") " 
pod="openshift-infra/auto-csr-approver-29563848-f28cg" Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.467963 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563848-f28cg" Mar 18 10:48:01 crc kubenswrapper[4778]: I0318 10:48:01.006901 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563848-f28cg"] Mar 18 10:48:01 crc kubenswrapper[4778]: I0318 10:48:01.983125 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563848-f28cg" event={"ID":"49327474-2bad-4ebc-b955-bf9dd1268c5e","Type":"ContainerStarted","Data":"9dd2005bf3609ca5c74b14f29aa73689791f9e708f340d3d96513f2c3aef7ae9"} Mar 18 10:48:02 crc kubenswrapper[4778]: I0318 10:48:02.993333 4778 generic.go:334] "Generic (PLEG): container finished" podID="49327474-2bad-4ebc-b955-bf9dd1268c5e" containerID="f8a6db8312e875033fdd1f73d7983e85a274f3fbde6864a2faaec123c194e5c8" exitCode=0 Mar 18 10:48:02 crc kubenswrapper[4778]: I0318 10:48:02.993402 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563848-f28cg" event={"ID":"49327474-2bad-4ebc-b955-bf9dd1268c5e","Type":"ContainerDied","Data":"f8a6db8312e875033fdd1f73d7983e85a274f3fbde6864a2faaec123c194e5c8"} Mar 18 10:48:04 crc kubenswrapper[4778]: I0318 10:48:04.380538 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563848-f28cg" Mar 18 10:48:04 crc kubenswrapper[4778]: I0318 10:48:04.486165 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g7w4\" (UniqueName: \"kubernetes.io/projected/49327474-2bad-4ebc-b955-bf9dd1268c5e-kube-api-access-2g7w4\") pod \"49327474-2bad-4ebc-b955-bf9dd1268c5e\" (UID: \"49327474-2bad-4ebc-b955-bf9dd1268c5e\") " Mar 18 10:48:04 crc kubenswrapper[4778]: I0318 10:48:04.500345 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49327474-2bad-4ebc-b955-bf9dd1268c5e-kube-api-access-2g7w4" (OuterVolumeSpecName: "kube-api-access-2g7w4") pod "49327474-2bad-4ebc-b955-bf9dd1268c5e" (UID: "49327474-2bad-4ebc-b955-bf9dd1268c5e"). InnerVolumeSpecName "kube-api-access-2g7w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:48:04 crc kubenswrapper[4778]: I0318 10:48:04.590692 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g7w4\" (UniqueName: \"kubernetes.io/projected/49327474-2bad-4ebc-b955-bf9dd1268c5e-kube-api-access-2g7w4\") on node \"crc\" DevicePath \"\"" Mar 18 10:48:05 crc kubenswrapper[4778]: I0318 10:48:05.017268 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563848-f28cg" event={"ID":"49327474-2bad-4ebc-b955-bf9dd1268c5e","Type":"ContainerDied","Data":"9dd2005bf3609ca5c74b14f29aa73689791f9e708f340d3d96513f2c3aef7ae9"} Mar 18 10:48:05 crc kubenswrapper[4778]: I0318 10:48:05.017576 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dd2005bf3609ca5c74b14f29aa73689791f9e708f340d3d96513f2c3aef7ae9" Mar 18 10:48:05 crc kubenswrapper[4778]: I0318 10:48:05.017369 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563848-f28cg" Mar 18 10:48:05 crc kubenswrapper[4778]: I0318 10:48:05.456677 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563842-ckkmp"] Mar 18 10:48:05 crc kubenswrapper[4778]: I0318 10:48:05.464025 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563842-ckkmp"] Mar 18 10:48:06 crc kubenswrapper[4778]: I0318 10:48:06.198711 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d10faaed-ffef-4afb-9f75-262e4fccd22a" path="/var/lib/kubelet/pods/d10faaed-ffef-4afb-9f75-262e4fccd22a/volumes" Mar 18 10:48:36 crc kubenswrapper[4778]: I0318 10:48:36.914648 4778 scope.go:117] "RemoveContainer" containerID="cec3fa048e9699e703eb8a3404384f6f46bb6a98f37648f4a97cf2fe11dab009" Mar 18 10:49:05 crc kubenswrapper[4778]: I0318 10:49:05.988538 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pgmtp"] Mar 18 10:49:05 crc kubenswrapper[4778]: E0318 10:49:05.989706 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49327474-2bad-4ebc-b955-bf9dd1268c5e" containerName="oc" Mar 18 10:49:05 crc kubenswrapper[4778]: I0318 10:49:05.989731 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="49327474-2bad-4ebc-b955-bf9dd1268c5e" containerName="oc" Mar 18 10:49:05 crc kubenswrapper[4778]: I0318 10:49:05.990063 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="49327474-2bad-4ebc-b955-bf9dd1268c5e" containerName="oc" Mar 18 10:49:05 crc kubenswrapper[4778]: I0318 10:49:05.992486 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.011131 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pgmtp"] Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.044367 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-catalog-content\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.044873 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jspvx\" (UniqueName: \"kubernetes.io/projected/0cdec5ae-a923-4018-9a0b-400916a4273f-kube-api-access-jspvx\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.044938 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-utilities\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.146345 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jspvx\" (UniqueName: \"kubernetes.io/projected/0cdec5ae-a923-4018-9a0b-400916a4273f-kube-api-access-jspvx\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.146387 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-utilities\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.146423 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-catalog-content\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.147025 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-catalog-content\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.147066 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-utilities\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.170168 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jspvx\" (UniqueName: \"kubernetes.io/projected/0cdec5ae-a923-4018-9a0b-400916a4273f-kube-api-access-jspvx\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.331950 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.790291 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pgmtp"] Mar 18 10:49:07 crc kubenswrapper[4778]: I0318 10:49:07.779185 4778 generic.go:334] "Generic (PLEG): container finished" podID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerID="66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9" exitCode=0 Mar 18 10:49:07 crc kubenswrapper[4778]: I0318 10:49:07.779268 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerDied","Data":"66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9"} Mar 18 10:49:07 crc kubenswrapper[4778]: I0318 10:49:07.782475 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerStarted","Data":"1aec8bdc1619f777aa2c0574af940f8616b13917329f5392e60aba7ef8b31165"} Mar 18 10:49:07 crc kubenswrapper[4778]: I0318 10:49:07.781652 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:49:09 crc kubenswrapper[4778]: I0318 10:49:09.806467 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerStarted","Data":"9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50"} Mar 18 10:49:10 crc kubenswrapper[4778]: I0318 10:49:10.819991 4778 generic.go:334] "Generic (PLEG): container finished" podID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerID="9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50" exitCode=0 Mar 18 10:49:10 crc kubenswrapper[4778]: I0318 10:49:10.820053 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerDied","Data":"9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50"} Mar 18 10:49:11 crc kubenswrapper[4778]: I0318 10:49:11.833953 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerStarted","Data":"ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b"} Mar 18 10:49:11 crc kubenswrapper[4778]: I0318 10:49:11.863711 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pgmtp" podStartSLOduration=3.290509345 podStartE2EDuration="6.863688217s" podCreationTimestamp="2026-03-18 10:49:05 +0000 UTC" firstStartedPulling="2026-03-18 10:49:07.781374108 +0000 UTC m=+6414.356118958" lastFinishedPulling="2026-03-18 10:49:11.35455299 +0000 UTC m=+6417.929297830" observedRunningTime="2026-03-18 10:49:11.858090246 +0000 UTC m=+6418.432835076" watchObservedRunningTime="2026-03-18 10:49:11.863688217 +0000 UTC m=+6418.438433067" Mar 18 10:49:16 crc kubenswrapper[4778]: I0318 10:49:16.333105 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:16 crc kubenswrapper[4778]: I0318 10:49:16.333689 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:17 crc kubenswrapper[4778]: I0318 10:49:17.391295 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pgmtp" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="registry-server" probeResult="failure" output=< Mar 18 10:49:17 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:49:17 crc kubenswrapper[4778]: > Mar 18 10:49:26 crc kubenswrapper[4778]: I0318 
10:49:26.402249 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:26 crc kubenswrapper[4778]: I0318 10:49:26.507418 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:26 crc kubenswrapper[4778]: I0318 10:49:26.652627 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pgmtp"] Mar 18 10:49:27 crc kubenswrapper[4778]: I0318 10:49:27.989664 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pgmtp" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="registry-server" containerID="cri-o://ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b" gracePeriod=2 Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.531613 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.638251 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-utilities\") pod \"0cdec5ae-a923-4018-9a0b-400916a4273f\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.638374 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-catalog-content\") pod \"0cdec5ae-a923-4018-9a0b-400916a4273f\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.638613 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jspvx\" (UniqueName: 
\"kubernetes.io/projected/0cdec5ae-a923-4018-9a0b-400916a4273f-kube-api-access-jspvx\") pod \"0cdec5ae-a923-4018-9a0b-400916a4273f\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.639153 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-utilities" (OuterVolumeSpecName: "utilities") pod "0cdec5ae-a923-4018-9a0b-400916a4273f" (UID: "0cdec5ae-a923-4018-9a0b-400916a4273f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.643943 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cdec5ae-a923-4018-9a0b-400916a4273f-kube-api-access-jspvx" (OuterVolumeSpecName: "kube-api-access-jspvx") pod "0cdec5ae-a923-4018-9a0b-400916a4273f" (UID: "0cdec5ae-a923-4018-9a0b-400916a4273f"). InnerVolumeSpecName "kube-api-access-jspvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.740861 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jspvx\" (UniqueName: \"kubernetes.io/projected/0cdec5ae-a923-4018-9a0b-400916a4273f-kube-api-access-jspvx\") on node \"crc\" DevicePath \"\"" Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.741109 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.799970 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cdec5ae-a923-4018-9a0b-400916a4273f" (UID: "0cdec5ae-a923-4018-9a0b-400916a4273f"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.842558 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.006406 4778 generic.go:334] "Generic (PLEG): container finished" podID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerID="ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b" exitCode=0 Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.006444 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerDied","Data":"ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b"} Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.006473 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerDied","Data":"1aec8bdc1619f777aa2c0574af940f8616b13917329f5392e60aba7ef8b31165"} Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.006490 4778 scope.go:117] "RemoveContainer" containerID="ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b" Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.007401 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pgmtp" Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.049990 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pgmtp"] Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.051754 4778 scope.go:117] "RemoveContainer" containerID="9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50" Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.061057 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pgmtp"] Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.077956 4778 scope.go:117] "RemoveContainer" containerID="66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9" Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.136339 4778 scope.go:117] "RemoveContainer" containerID="ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b" Mar 18 10:49:29 crc kubenswrapper[4778]: E0318 10:49:29.136761 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b\": container with ID starting with ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b not found: ID does not exist" containerID="ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b" Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.136817 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b"} err="failed to get container status \"ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b\": rpc error: code = NotFound desc = could not find container \"ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b\": container with ID starting with ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b not found: ID does 
not exist" Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.136850 4778 scope.go:117] "RemoveContainer" containerID="9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50" Mar 18 10:49:29 crc kubenswrapper[4778]: E0318 10:49:29.137230 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50\": container with ID starting with 9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50 not found: ID does not exist" containerID="9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50" Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.137262 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50"} err="failed to get container status \"9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50\": rpc error: code = NotFound desc = could not find container \"9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50\": container with ID starting with 9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50 not found: ID does not exist" Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.137300 4778 scope.go:117] "RemoveContainer" containerID="66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9" Mar 18 10:49:29 crc kubenswrapper[4778]: E0318 10:49:29.137551 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9\": container with ID starting with 66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9 not found: ID does not exist" containerID="66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9" Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.137579 4778 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9"} err="failed to get container status \"66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9\": rpc error: code = NotFound desc = could not find container \"66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9\": container with ID starting with 66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9 not found: ID does not exist" Mar 18 10:49:30 crc kubenswrapper[4778]: I0318 10:49:30.147418 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:49:30 crc kubenswrapper[4778]: I0318 10:49:30.147770 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:49:30 crc kubenswrapper[4778]: I0318 10:49:30.197396 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" path="/var/lib/kubelet/pods/0cdec5ae-a923-4018-9a0b-400916a4273f/volumes" Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.147193 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563850-jrcnr"] Mar 18 10:50:00 crc kubenswrapper[4778]: E0318 10:50:00.148066 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="extract-content" Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.148079 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="extract-content" Mar 18 10:50:00 crc kubenswrapper[4778]: E0318 10:50:00.148097 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="registry-server" Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.148103 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="registry-server" Mar 18 10:50:00 crc kubenswrapper[4778]: E0318 10:50:00.148116 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="extract-utilities" Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.148123 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="extract-utilities" Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.148328 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="registry-server" Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.149063 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563850-jrcnr" Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.147653 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.149612 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.152937 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.154004 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.154262 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.169841 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563850-jrcnr"] Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.207504 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwmkn\" (UniqueName: \"kubernetes.io/projected/fc499d31-e373-413b-8a38-1fa69f007f2f-kube-api-access-bwmkn\") pod \"auto-csr-approver-29563850-jrcnr\" (UID: \"fc499d31-e373-413b-8a38-1fa69f007f2f\") " pod="openshift-infra/auto-csr-approver-29563850-jrcnr" Mar 18 10:50:00 crc 
kubenswrapper[4778]: I0318 10:50:00.310872 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwmkn\" (UniqueName: \"kubernetes.io/projected/fc499d31-e373-413b-8a38-1fa69f007f2f-kube-api-access-bwmkn\") pod \"auto-csr-approver-29563850-jrcnr\" (UID: \"fc499d31-e373-413b-8a38-1fa69f007f2f\") " pod="openshift-infra/auto-csr-approver-29563850-jrcnr"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.328830 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwmkn\" (UniqueName: \"kubernetes.io/projected/fc499d31-e373-413b-8a38-1fa69f007f2f-kube-api-access-bwmkn\") pod \"auto-csr-approver-29563850-jrcnr\" (UID: \"fc499d31-e373-413b-8a38-1fa69f007f2f\") " pod="openshift-infra/auto-csr-approver-29563850-jrcnr"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.472938 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563850-jrcnr"
Mar 18 10:50:01 crc kubenswrapper[4778]: I0318 10:50:01.015985 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563850-jrcnr"]
Mar 18 10:50:01 crc kubenswrapper[4778]: W0318 10:50:01.023414 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc499d31_e373_413b_8a38_1fa69f007f2f.slice/crio-3f2a027b89707a6aaf4202a17985fbf462d8b0b22415b14007ece0b2c1942cf3 WatchSource:0}: Error finding container 3f2a027b89707a6aaf4202a17985fbf462d8b0b22415b14007ece0b2c1942cf3: Status 404 returned error can't find the container with id 3f2a027b89707a6aaf4202a17985fbf462d8b0b22415b14007ece0b2c1942cf3
Mar 18 10:50:01 crc kubenswrapper[4778]: I0318 10:50:01.353766 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563850-jrcnr" event={"ID":"fc499d31-e373-413b-8a38-1fa69f007f2f","Type":"ContainerStarted","Data":"3f2a027b89707a6aaf4202a17985fbf462d8b0b22415b14007ece0b2c1942cf3"}
Mar 18 10:50:03 crc kubenswrapper[4778]: I0318 10:50:03.378681 4778 generic.go:334] "Generic (PLEG): container finished" podID="fc499d31-e373-413b-8a38-1fa69f007f2f" containerID="f14aa13ab7520a46eb0a2d95277ca80a41dbb7bc18bd9145e42c1b01a855cabf" exitCode=0
Mar 18 10:50:03 crc kubenswrapper[4778]: I0318 10:50:03.378754 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563850-jrcnr" event={"ID":"fc499d31-e373-413b-8a38-1fa69f007f2f","Type":"ContainerDied","Data":"f14aa13ab7520a46eb0a2d95277ca80a41dbb7bc18bd9145e42c1b01a855cabf"}
Mar 18 10:50:04 crc kubenswrapper[4778]: I0318 10:50:04.877453 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563850-jrcnr"
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.071898 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwmkn\" (UniqueName: \"kubernetes.io/projected/fc499d31-e373-413b-8a38-1fa69f007f2f-kube-api-access-bwmkn\") pod \"fc499d31-e373-413b-8a38-1fa69f007f2f\" (UID: \"fc499d31-e373-413b-8a38-1fa69f007f2f\") "
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.079383 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc499d31-e373-413b-8a38-1fa69f007f2f-kube-api-access-bwmkn" (OuterVolumeSpecName: "kube-api-access-bwmkn") pod "fc499d31-e373-413b-8a38-1fa69f007f2f" (UID: "fc499d31-e373-413b-8a38-1fa69f007f2f"). InnerVolumeSpecName "kube-api-access-bwmkn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.174170 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwmkn\" (UniqueName: \"kubernetes.io/projected/fc499d31-e373-413b-8a38-1fa69f007f2f-kube-api-access-bwmkn\") on node \"crc\" DevicePath \"\""
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.403261 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563850-jrcnr" event={"ID":"fc499d31-e373-413b-8a38-1fa69f007f2f","Type":"ContainerDied","Data":"3f2a027b89707a6aaf4202a17985fbf462d8b0b22415b14007ece0b2c1942cf3"}
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.403314 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f2a027b89707a6aaf4202a17985fbf462d8b0b22415b14007ece0b2c1942cf3"
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.403392 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563850-jrcnr"
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.973777 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563844-j7kb9"]
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.983256 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563844-j7kb9"]
Mar 18 10:50:06 crc kubenswrapper[4778]: I0318 10:50:06.202971 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2098dac3-962e-4d75-b22f-81aadc768dc6" path="/var/lib/kubelet/pods/2098dac3-962e-4d75-b22f-81aadc768dc6/volumes"
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.146956 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.147507 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.147558 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7"
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.148425 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.148495 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" gracePeriod=600
Mar 18 10:50:30 crc kubenswrapper[4778]: E0318 10:50:30.269274 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.677287 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" exitCode=0
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.677363 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"}
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.677509 4778 scope.go:117] "RemoveContainer" containerID="90db09640ef7e7d8376edb316416510275b2a613d72a3ab935ed1da863b06852"
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.678890 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"
Mar 18 10:50:30 crc kubenswrapper[4778]: E0318 10:50:30.679339 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:50:37 crc kubenswrapper[4778]: I0318 10:50:37.063244 4778 scope.go:117] "RemoveContainer" containerID="88b38e0fbddd0bbafde023ebaf3f6bdcd76dd7a995f8e2d8d9c48c114d683213"
Mar 18 10:50:42 crc kubenswrapper[4778]: I0318 10:50:42.188069 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"
Mar 18 10:50:42 crc kubenswrapper[4778]: E0318 10:50:42.189384 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:50:56 crc kubenswrapper[4778]: I0318 10:50:56.187413 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"
Mar 18 10:50:56 crc kubenswrapper[4778]: E0318 10:50:56.188813 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:51:07 crc kubenswrapper[4778]: I0318 10:51:07.188423 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"
Mar 18 10:51:07 crc kubenswrapper[4778]: E0318 10:51:07.189718 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:51:19 crc kubenswrapper[4778]: I0318 10:51:19.188068 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"
Mar 18 10:51:19 crc kubenswrapper[4778]: E0318 10:51:19.189078 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:51:32 crc kubenswrapper[4778]: I0318 10:51:32.187379 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"
Mar 18 10:51:32 crc kubenswrapper[4778]: E0318 10:51:32.189226 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:51:47 crc kubenswrapper[4778]: I0318 10:51:47.188758 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"
Mar 18 10:51:47 crc kubenswrapper[4778]: E0318 10:51:47.189744 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.145325 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563852-h8wf4"]
Mar 18 10:52:00 crc kubenswrapper[4778]: E0318 10:52:00.146547 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc499d31-e373-413b-8a38-1fa69f007f2f" containerName="oc"
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.146563 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc499d31-e373-413b-8a38-1fa69f007f2f" containerName="oc"
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.146773 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc499d31-e373-413b-8a38-1fa69f007f2f" containerName="oc"
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.147681 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563852-h8wf4"
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.149519 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.150090 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.151498 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.155574 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563852-h8wf4"]
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.239912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh6d4\" (UniqueName: \"kubernetes.io/projected/902826c1-406d-4d16-8655-4a85ff4a3205-kube-api-access-bh6d4\") pod \"auto-csr-approver-29563852-h8wf4\" (UID: \"902826c1-406d-4d16-8655-4a85ff4a3205\") " pod="openshift-infra/auto-csr-approver-29563852-h8wf4"
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.344176 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh6d4\" (UniqueName: \"kubernetes.io/projected/902826c1-406d-4d16-8655-4a85ff4a3205-kube-api-access-bh6d4\") pod \"auto-csr-approver-29563852-h8wf4\" (UID: \"902826c1-406d-4d16-8655-4a85ff4a3205\") " pod="openshift-infra/auto-csr-approver-29563852-h8wf4"
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.367263 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh6d4\" (UniqueName: \"kubernetes.io/projected/902826c1-406d-4d16-8655-4a85ff4a3205-kube-api-access-bh6d4\") pod \"auto-csr-approver-29563852-h8wf4\" (UID: \"902826c1-406d-4d16-8655-4a85ff4a3205\") " pod="openshift-infra/auto-csr-approver-29563852-h8wf4"
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.469956 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563852-h8wf4"
Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.926103 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563852-h8wf4"]
Mar 18 10:52:01 crc kubenswrapper[4778]: I0318 10:52:01.613558 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563852-h8wf4" event={"ID":"902826c1-406d-4d16-8655-4a85ff4a3205","Type":"ContainerStarted","Data":"5edfd3af9b234546d9080f358abf5f5b13e6c14f3ab7072698599281d474283c"}
Mar 18 10:52:02 crc kubenswrapper[4778]: I0318 10:52:02.187053 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"
Mar 18 10:52:02 crc kubenswrapper[4778]: E0318 10:52:02.187563 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:52:02 crc kubenswrapper[4778]: I0318 10:52:02.628657 4778 generic.go:334] "Generic (PLEG): container finished" podID="902826c1-406d-4d16-8655-4a85ff4a3205" containerID="96707a6d26a4e59376b5ccb6d995399c8158c2cfc047ca02c91e8c3ceb00d6d6" exitCode=0
Mar 18 10:52:02 crc kubenswrapper[4778]: I0318 10:52:02.628742 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563852-h8wf4" event={"ID":"902826c1-406d-4d16-8655-4a85ff4a3205","Type":"ContainerDied","Data":"96707a6d26a4e59376b5ccb6d995399c8158c2cfc047ca02c91e8c3ceb00d6d6"}
Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.025721 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563852-h8wf4"
Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.118429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh6d4\" (UniqueName: \"kubernetes.io/projected/902826c1-406d-4d16-8655-4a85ff4a3205-kube-api-access-bh6d4\") pod \"902826c1-406d-4d16-8655-4a85ff4a3205\" (UID: \"902826c1-406d-4d16-8655-4a85ff4a3205\") "
Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.128048 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902826c1-406d-4d16-8655-4a85ff4a3205-kube-api-access-bh6d4" (OuterVolumeSpecName: "kube-api-access-bh6d4") pod "902826c1-406d-4d16-8655-4a85ff4a3205" (UID: "902826c1-406d-4d16-8655-4a85ff4a3205"). InnerVolumeSpecName "kube-api-access-bh6d4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.221713 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh6d4\" (UniqueName: \"kubernetes.io/projected/902826c1-406d-4d16-8655-4a85ff4a3205-kube-api-access-bh6d4\") on node \"crc\" DevicePath \"\""
Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.649778 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563852-h8wf4" event={"ID":"902826c1-406d-4d16-8655-4a85ff4a3205","Type":"ContainerDied","Data":"5edfd3af9b234546d9080f358abf5f5b13e6c14f3ab7072698599281d474283c"}
Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.649818 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5edfd3af9b234546d9080f358abf5f5b13e6c14f3ab7072698599281d474283c"
Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.649872 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563852-h8wf4"
Mar 18 10:52:05 crc kubenswrapper[4778]: I0318 10:52:05.112396 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563846-jwrl9"]
Mar 18 10:52:05 crc kubenswrapper[4778]: I0318 10:52:05.124786 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563846-jwrl9"]
Mar 18 10:52:05 crc kubenswrapper[4778]: I0318 10:52:05.663491 4778 generic.go:334] "Generic (PLEG): container finished" podID="c5a7a532-f8c2-4741-9892-65047a4cb225" containerID="44ebfbf33b960c39e1ffc52c3185dc0dc1ec7c33f6f6b2ba0c1b6ca80065a482" exitCode=0
Mar 18 10:52:05 crc kubenswrapper[4778]: I0318 10:52:05.663533 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"c5a7a532-f8c2-4741-9892-65047a4cb225","Type":"ContainerDied","Data":"44ebfbf33b960c39e1ffc52c3185dc0dc1ec7c33f6f6b2ba0c1b6ca80065a482"}
Mar 18 10:52:06 crc kubenswrapper[4778]: I0318 10:52:06.197806 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed10f0a-3d2d-483e-9532-dd1f7b38631b" path="/var/lib/kubelet/pods/2ed10f0a-3d2d-483e-9532-dd1f7b38631b/volumes"
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.086906 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test"
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205360 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-temporary\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") "
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205539 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ca-certs\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") "
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205602 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ceph\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") "
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205644 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-config-data\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") "
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205693 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") "
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205715 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx57k\" (UniqueName: \"kubernetes.io/projected/c5a7a532-f8c2-4741-9892-65047a4cb225-kube-api-access-sx57k\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") "
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205762 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") "
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205847 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-workdir\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") "
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205870 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config-secret\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") "
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205889 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ssh-key\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") "
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.206264 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.206380 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-config-data" (OuterVolumeSpecName: "config-data") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.206865 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.206889 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.211987 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.212177 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ceph" (OuterVolumeSpecName: "ceph") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.212595 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5a7a532-f8c2-4741-9892-65047a4cb225-kube-api-access-sx57k" (OuterVolumeSpecName: "kube-api-access-sx57k") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "kube-api-access-sx57k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.223760 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.234970 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.235470 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.237355 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.264365 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.309240 4778 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ca-certs\") on node \"crc\" DevicePath \"\""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.309280 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ceph\") on node \"crc\" DevicePath \"\""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.309326 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.309342 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx57k\" (UniqueName: \"kubernetes.io/projected/c5a7a532-f8c2-4741-9892-65047a4cb225-kube-api-access-sx57k\") on node \"crc\" DevicePath \"\""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.309358 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.319910 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.319927 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.319943 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ssh-key\") on node \"crc\" DevicePath \"\""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.334427 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.422235 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.681626 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"c5a7a532-f8c2-4741-9892-65047a4cb225","Type":"ContainerDied","Data":"1ee8cf024ce4398d1f0ed48d786bbb6b3add9e2f95a7fd4bf27b0fad0caf4251"}
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.681708 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ee8cf024ce4398d1f0ed48d786bbb6b3add9e2f95a7fd4bf27b0fad0caf4251"
Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.681720 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.093936 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 18 10:52:17 crc kubenswrapper[4778]: E0318 10:52:17.095079 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a7a532-f8c2-4741-9892-65047a4cb225" containerName="tempest-tests-tempest-tests-runner"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.095100 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a7a532-f8c2-4741-9892-65047a4cb225" containerName="tempest-tests-tempest-tests-runner"
Mar 18 10:52:17 crc kubenswrapper[4778]: E0318 10:52:17.095135 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902826c1-406d-4d16-8655-4a85ff4a3205" containerName="oc"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.095150 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="902826c1-406d-4d16-8655-4a85ff4a3205" containerName="oc"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.095506 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="902826c1-406d-4d16-8655-4a85ff4a3205" containerName="oc"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.095548 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a7a532-f8c2-4741-9892-65047a4cb225" containerName="tempest-tests-tempest-tests-runner"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.096512 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.099709 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-htxt6"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.112690 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.187663 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"
Mar 18 10:52:17 crc kubenswrapper[4778]: E0318 10:52:17.187966 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.266102 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-562p4\" (UniqueName: \"kubernetes.io/projected/fb176b71-d782-4b0d-963f-94acef50cf11-kube-api-access-562p4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.266159 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.368033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-562p4\" (UniqueName: \"kubernetes.io/projected/fb176b71-d782-4b0d-963f-94acef50cf11-kube-api-access-562p4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.368327 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.368783 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.396758 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-562p4\" (UniqueName: \"kubernetes.io/projected/fb176b71-d782-4b0d-963f-94acef50cf11-kube-api-access-562p4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.404305 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.428698 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.891926 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 18 10:52:17 crc kubenswrapper[4778]: W0318 10:52:17.901223 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb176b71_d782_4b0d_963f_94acef50cf11.slice/crio-653d5a825531d8590579aa864537696a66078d42eb7ce52cda9b0eef0c268fd9 WatchSource:0}: Error finding container 653d5a825531d8590579aa864537696a66078d42eb7ce52cda9b0eef0c268fd9: Status 404 returned error can't find the container with id 653d5a825531d8590579aa864537696a66078d42eb7ce52cda9b0eef0c268fd9
Mar 18 10:52:18 crc kubenswrapper[4778]: I0318 10:52:18.797395 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"fb176b71-d782-4b0d-963f-94acef50cf11","Type":"ContainerStarted","Data":"653d5a825531d8590579aa864537696a66078d42eb7ce52cda9b0eef0c268fd9"}
Mar 18 10:52:19 crc kubenswrapper[4778]: I0318 10:52:19.811583 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"fb176b71-d782-4b0d-963f-94acef50cf11","Type":"ContainerStarted","Data":"22db3cee620bf797616b27ba0cc8fb619ec2da9044209f42c698e36d77066088"}
Mar 18 10:52:19 crc kubenswrapper[4778]: I0318 10:52:19.841878 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.912294358 podStartE2EDuration="2.841859875s" podCreationTimestamp="2026-03-18 10:52:17 +0000 UTC" firstStartedPulling="2026-03-18 10:52:17.904493526 +0000 UTC m=+6604.479238366" lastFinishedPulling="2026-03-18 10:52:18.834059043 +0000 UTC m=+6605.408803883" observedRunningTime="2026-03-18 10:52:19.831922956 +0000 UTC m=+6606.406667836" watchObservedRunningTime="2026-03-18 10:52:19.841859875 +0000 UTC m=+6606.416604715" Mar 18 10:52:31 crc kubenswrapper[4778]: I0318 10:52:31.187755 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:52:31 crc kubenswrapper[4778]: E0318 10:52:31.188932 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:52:37 crc kubenswrapper[4778]: I0318 10:52:37.155323 4778 scope.go:117] "RemoveContainer" containerID="a51630f7ff38c957b6d8be33f92679338164d3fd19d236304cf23699728f1e4b" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.893946 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.897050 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.901104 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobiko-tobiko-config-0" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.901256 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"tobiko-secret" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.901689 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobiko-tobiko-private-key-0" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.901748 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"test-operator-clouds-config" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.901759 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobiko-tobiko-public-key-0" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.906922 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050017 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050092 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxhtn\" (UniqueName: \"kubernetes.io/projected/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kube-api-access-dxhtn\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " 
pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050124 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050181 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050421 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050464 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050539 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050598 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050692 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050900 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050956 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: 
\"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.051039 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152489 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152604 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152641 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152720 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152764 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152820 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxhtn\" (UniqueName: \"kubernetes.io/projected/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kube-api-access-dxhtn\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152864 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152903 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: 
I0318 10:52:39.152939 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152962 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152986 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.153019 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.153688 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " 
pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.153729 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.154666 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.154691 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.155773 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.156321 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-private-key\" (UniqueName: 
\"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.156626 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.159385 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.159635 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.167051 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.169356 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.171024 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxhtn\" (UniqueName: \"kubernetes.io/projected/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kube-api-access-dxhtn\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.180813 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.262749 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.879042 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Mar 18 10:52:40 crc kubenswrapper[4778]: I0318 10:52:40.052535 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557","Type":"ContainerStarted","Data":"aebcfed3451b3e3d8b172a76c2bb743ffd3f47051e3238bd0316471042774306"} Mar 18 10:52:44 crc kubenswrapper[4778]: I0318 10:52:44.196078 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:52:44 crc kubenswrapper[4778]: E0318 10:52:44.197067 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.600644 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6s7qj"] Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.603457 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.613079 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6s7qj"] Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.675668 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-utilities\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.675736 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2m7j\" (UniqueName: \"kubernetes.io/projected/c3bf4465-218c-43ec-84d3-9881b5d329ea-kube-api-access-x2m7j\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.675835 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-catalog-content\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.777705 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-utilities\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.777792 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x2m7j\" (UniqueName: \"kubernetes.io/projected/c3bf4465-218c-43ec-84d3-9881b5d329ea-kube-api-access-x2m7j\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.777819 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-catalog-content\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.778480 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-catalog-content\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.778746 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-utilities\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.799222 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2m7j\" (UniqueName: \"kubernetes.io/projected/c3bf4465-218c-43ec-84d3-9881b5d329ea-kube-api-access-x2m7j\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.936427 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:54 crc kubenswrapper[4778]: I0318 10:52:54.165710 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6s7qj"] Mar 18 10:52:54 crc kubenswrapper[4778]: W0318 10:52:54.180567 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3bf4465_218c_43ec_84d3_9881b5d329ea.slice/crio-d13bbae561ed6205993db40c0a89f43d0f77fc3e9bdbd8f834db82cc8dcd4c54 WatchSource:0}: Error finding container d13bbae561ed6205993db40c0a89f43d0f77fc3e9bdbd8f834db82cc8dcd4c54: Status 404 returned error can't find the container with id d13bbae561ed6205993db40c0a89f43d0f77fc3e9bdbd8f834db82cc8dcd4c54 Mar 18 10:52:55 crc kubenswrapper[4778]: I0318 10:52:55.195852 4778 generic.go:334] "Generic (PLEG): container finished" podID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerID="7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb" exitCode=0 Mar 18 10:52:55 crc kubenswrapper[4778]: I0318 10:52:55.197680 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerDied","Data":"7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb"} Mar 18 10:52:55 crc kubenswrapper[4778]: I0318 10:52:55.197972 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerStarted","Data":"d13bbae561ed6205993db40c0a89f43d0f77fc3e9bdbd8f834db82cc8dcd4c54"} Mar 18 10:52:55 crc kubenswrapper[4778]: I0318 10:52:55.197988 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" 
event={"ID":"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557","Type":"ContainerStarted","Data":"ba5823dd6f9a4d25d340574d43a268718dced1548735d14a1482f6123dc8e01d"} Mar 18 10:52:55 crc kubenswrapper[4778]: I0318 10:52:55.254115 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" podStartSLOduration=4.3420062999999995 podStartE2EDuration="18.254093787s" podCreationTimestamp="2026-03-18 10:52:37 +0000 UTC" firstStartedPulling="2026-03-18 10:52:39.889090731 +0000 UTC m=+6626.463835591" lastFinishedPulling="2026-03-18 10:52:53.801178198 +0000 UTC m=+6640.375923078" observedRunningTime="2026-03-18 10:52:55.242600936 +0000 UTC m=+6641.817345826" watchObservedRunningTime="2026-03-18 10:52:55.254093787 +0000 UTC m=+6641.828838637" Mar 18 10:52:56 crc kubenswrapper[4778]: I0318 10:52:56.212997 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerStarted","Data":"3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908"} Mar 18 10:52:57 crc kubenswrapper[4778]: I0318 10:52:57.220765 4778 generic.go:334] "Generic (PLEG): container finished" podID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerID="3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908" exitCode=0 Mar 18 10:52:57 crc kubenswrapper[4778]: I0318 10:52:57.220814 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerDied","Data":"3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908"} Mar 18 10:52:58 crc kubenswrapper[4778]: I0318 10:52:58.187891 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:52:58 crc kubenswrapper[4778]: E0318 10:52:58.188516 4778 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:53:00 crc kubenswrapper[4778]: I0318 10:53:00.254324 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerStarted","Data":"e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff"} Mar 18 10:53:00 crc kubenswrapper[4778]: I0318 10:53:00.273019 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6s7qj" podStartSLOduration=7.133258259 podStartE2EDuration="11.273002993s" podCreationTimestamp="2026-03-18 10:52:49 +0000 UTC" firstStartedPulling="2026-03-18 10:52:55.19876377 +0000 UTC m=+6641.773508610" lastFinishedPulling="2026-03-18 10:52:59.338508504 +0000 UTC m=+6645.913253344" observedRunningTime="2026-03-18 10:53:00.270462245 +0000 UTC m=+6646.845207115" watchObservedRunningTime="2026-03-18 10:53:00.273002993 +0000 UTC m=+6646.847747833" Mar 18 10:53:09 crc kubenswrapper[4778]: I0318 10:53:09.963874 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:53:09 crc kubenswrapper[4778]: I0318 10:53:09.965088 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:53:10 crc kubenswrapper[4778]: I0318 10:53:10.038299 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:53:10 crc kubenswrapper[4778]: I0318 10:53:10.187046 4778 scope.go:117] 
"RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:53:10 crc kubenswrapper[4778]: E0318 10:53:10.187364 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:53:10 crc kubenswrapper[4778]: I0318 10:53:10.432769 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:53:10 crc kubenswrapper[4778]: I0318 10:53:10.500142 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6s7qj"] Mar 18 10:53:12 crc kubenswrapper[4778]: I0318 10:53:12.381284 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6s7qj" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="registry-server" containerID="cri-o://e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff" gracePeriod=2 Mar 18 10:53:12 crc kubenswrapper[4778]: I0318 10:53:12.841056 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.031254 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m7j\" (UniqueName: \"kubernetes.io/projected/c3bf4465-218c-43ec-84d3-9881b5d329ea-kube-api-access-x2m7j\") pod \"c3bf4465-218c-43ec-84d3-9881b5d329ea\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.031498 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-catalog-content\") pod \"c3bf4465-218c-43ec-84d3-9881b5d329ea\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.031535 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-utilities\") pod \"c3bf4465-218c-43ec-84d3-9881b5d329ea\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.032359 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-utilities" (OuterVolumeSpecName: "utilities") pod "c3bf4465-218c-43ec-84d3-9881b5d329ea" (UID: "c3bf4465-218c-43ec-84d3-9881b5d329ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.045943 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bf4465-218c-43ec-84d3-9881b5d329ea-kube-api-access-x2m7j" (OuterVolumeSpecName: "kube-api-access-x2m7j") pod "c3bf4465-218c-43ec-84d3-9881b5d329ea" (UID: "c3bf4465-218c-43ec-84d3-9881b5d329ea"). InnerVolumeSpecName "kube-api-access-x2m7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.115861 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3bf4465-218c-43ec-84d3-9881b5d329ea" (UID: "c3bf4465-218c-43ec-84d3-9881b5d329ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.133942 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.133978 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.133989 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m7j\" (UniqueName: \"kubernetes.io/projected/c3bf4465-218c-43ec-84d3-9881b5d329ea-kube-api-access-x2m7j\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.396755 4778 generic.go:334] "Generic (PLEG): container finished" podID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerID="e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff" exitCode=0 Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.396825 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.396851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerDied","Data":"e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff"} Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.397439 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerDied","Data":"d13bbae561ed6205993db40c0a89f43d0f77fc3e9bdbd8f834db82cc8dcd4c54"} Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.397556 4778 scope.go:117] "RemoveContainer" containerID="e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.427972 4778 scope.go:117] "RemoveContainer" containerID="3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.441530 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6s7qj"] Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.451645 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6s7qj"] Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.459164 4778 scope.go:117] "RemoveContainer" containerID="7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.511810 4778 scope.go:117] "RemoveContainer" containerID="e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff" Mar 18 10:53:13 crc kubenswrapper[4778]: E0318 10:53:13.512274 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff\": container with ID starting with e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff not found: ID does not exist" containerID="e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.512375 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff"} err="failed to get container status \"e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff\": rpc error: code = NotFound desc = could not find container \"e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff\": container with ID starting with e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff not found: ID does not exist" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.512399 4778 scope.go:117] "RemoveContainer" containerID="3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908" Mar 18 10:53:13 crc kubenswrapper[4778]: E0318 10:53:13.512735 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908\": container with ID starting with 3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908 not found: ID does not exist" containerID="3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.512780 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908"} err="failed to get container status \"3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908\": rpc error: code = NotFound desc = could not find container \"3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908\": container with ID 
starting with 3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908 not found: ID does not exist" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.512807 4778 scope.go:117] "RemoveContainer" containerID="7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb" Mar 18 10:53:13 crc kubenswrapper[4778]: E0318 10:53:13.513283 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb\": container with ID starting with 7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb not found: ID does not exist" containerID="7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.513308 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb"} err="failed to get container status \"7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb\": rpc error: code = NotFound desc = could not find container \"7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb\": container with ID starting with 7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb not found: ID does not exist" Mar 18 10:53:14 crc kubenswrapper[4778]: I0318 10:53:14.210739 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" path="/var/lib/kubelet/pods/c3bf4465-218c-43ec-84d3-9881b5d329ea/volumes" Mar 18 10:53:24 crc kubenswrapper[4778]: I0318 10:53:24.203311 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:53:24 crc kubenswrapper[4778]: E0318 10:53:24.204769 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:53:35 crc kubenswrapper[4778]: I0318 10:53:35.188104 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:53:35 crc kubenswrapper[4778]: E0318 10:53:35.189576 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:53:47 crc kubenswrapper[4778]: I0318 10:53:47.188671 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:53:47 crc kubenswrapper[4778]: E0318 10:53:47.189523 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:53:55 crc kubenswrapper[4778]: I0318 10:53:55.833632 4778 generic.go:334] "Generic (PLEG): container finished" podID="5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" containerID="ba5823dd6f9a4d25d340574d43a268718dced1548735d14a1482f6123dc8e01d" exitCode=0 Mar 18 10:53:55 crc kubenswrapper[4778]: I0318 10:53:55.834410 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557","Type":"ContainerDied","Data":"ba5823dd6f9a4d25d340574d43a268718dced1548735d14a1482f6123dc8e01d"} Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.307506 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397143 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Mar 18 10:53:57 crc kubenswrapper[4778]: E0318 10:53:57.397504 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" containerName="tobiko-tests-tobiko" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397525 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" containerName="tobiko-tests-tobiko" Mar 18 10:53:57 crc kubenswrapper[4778]: E0318 10:53:57.397558 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="registry-server" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397566 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="registry-server" Mar 18 10:53:57 crc kubenswrapper[4778]: E0318 10:53:57.397580 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="extract-utilities" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397588 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="extract-utilities" Mar 18 10:53:57 crc kubenswrapper[4778]: E0318 10:53:57.397604 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="extract-content" Mar 18 
10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397610 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="extract-content" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397780 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" containerName="tobiko-tests-tobiko" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397812 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="registry-server" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.398584 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.401146 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobiko-tobiko-config-1" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.402052 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobiko-tobiko-private-key-1" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.403014 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobiko-tobiko-public-key-1" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.412931 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489264 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-clouds-config\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489380 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kubeconfig\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489403 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ceph\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489436 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-workdir\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489575 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ca-certs\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489608 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-public-key\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489640 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 
18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489665 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-config\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489705 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-temporary\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489742 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-private-key\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489771 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxhtn\" (UniqueName: \"kubernetes.io/projected/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kube-api-access-dxhtn\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489872 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-openstack-config-secret\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.491034 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.494434 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ceph" (OuterVolumeSpecName: "ceph") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.494852 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kube-api-access-dxhtn" (OuterVolumeSpecName: "kube-api-access-dxhtn") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "kube-api-access-dxhtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.499264 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.515944 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-config" (OuterVolumeSpecName: "tobiko-config") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). 
InnerVolumeSpecName "tobiko-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.517893 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.524213 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-private-key" (OuterVolumeSpecName: "tobiko-private-key") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "tobiko-private-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.532445 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kubeconfig" (OuterVolumeSpecName: "kubeconfig") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "kubeconfig". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.542872 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-public-key" (OuterVolumeSpecName: "tobiko-public-key") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "tobiko-public-key". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.558988 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.574813 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "test-operator-clouds-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592446 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592491 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592513 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" 
(UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592531 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592623 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592662 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592685 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592706 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592757 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592781 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592812 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmpg\" (UniqueName: \"kubernetes.io/projected/bd565818-8912-47ba-881f-f88011fa9b46-kube-api-access-swmpg\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592844 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc 
kubenswrapper[4778]: I0318 10:53:57.592917 4778 reconciler_common.go:293] "Volume detached for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kubeconfig\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592993 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593031 4778 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593046 4778 reconciler_common.go:293] "Volume detached for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-public-key\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593057 4778 reconciler_common.go:293] "Volume detached for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593068 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593078 4778 reconciler_common.go:293] "Volume detached for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-private-key\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593088 4778 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-dxhtn\" (UniqueName: \"kubernetes.io/projected/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kube-api-access-dxhtn\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593105 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593115 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.627080 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695009 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695354 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695474 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695481 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695524 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695578 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695624 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc 
kubenswrapper[4778]: I0318 10:53:57.695642 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695676 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695705 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swmpg\" (UniqueName: \"kubernetes.io/projected/bd565818-8912-47ba-881f-f88011fa9b46-kube-api-access-swmpg\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695766 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 
10:53:57.697060 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.697726 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.698301 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.698621 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.699008 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: 
I0318 10:53:57.699490 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.699876 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.700859 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.701081 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.712744 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmpg\" (UniqueName: \"kubernetes.io/projected/bd565818-8912-47ba-881f-f88011fa9b46-kube-api-access-swmpg\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.723030 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.857176 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557","Type":"ContainerDied","Data":"aebcfed3451b3e3d8b172a76c2bb743ffd3f47051e3238bd0316471042774306"} Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.857598 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aebcfed3451b3e3d8b172a76c2bb743ffd3f47051e3238bd0316471042774306" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.857652 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:53:58 crc kubenswrapper[4778]: I0318 10:53:58.190408 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:53:58 crc kubenswrapper[4778]: E0318 10:53:58.190733 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:53:58 crc kubenswrapper[4778]: I0318 10:53:58.249525 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Mar 18 10:53:58 crc kubenswrapper[4778]: I0318 10:53:58.825248 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" 
(UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:53:58 crc kubenswrapper[4778]: I0318 10:53:58.826000 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:58 crc kubenswrapper[4778]: I0318 10:53:58.868737 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"bd565818-8912-47ba-881f-f88011fa9b46","Type":"ContainerStarted","Data":"9010a2462d1e9f7fe8f1670549c3a224043e894c401167ef3df32c9556413256"} Mar 18 10:53:59 crc kubenswrapper[4778]: I0318 10:53:59.880607 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"bd565818-8912-47ba-881f-f88011fa9b46","Type":"ContainerStarted","Data":"878a250da8e9fd68a2017bd74707da9dc4870b9273766f35be6449c7f483e262"} Mar 18 10:53:59 crc kubenswrapper[4778]: I0318 10:53:59.910767 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tobiko-tests-tobiko-s01-sanity" podStartSLOduration=2.910746937 podStartE2EDuration="2.910746937s" podCreationTimestamp="2026-03-18 10:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:53:59.901457066 +0000 UTC m=+6706.476201916" watchObservedRunningTime="2026-03-18 10:53:59.910746937 +0000 UTC m=+6706.485491787" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.132936 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563854-qqn9z"] Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.134465 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.138578 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.138750 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.139383 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.154679 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563854-qqn9z"] Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.257694 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqwq8\" (UniqueName: \"kubernetes.io/projected/f6103ea7-c41a-40d2-ae16-15f066c955b9-kube-api-access-zqwq8\") pod \"auto-csr-approver-29563854-qqn9z\" (UID: \"f6103ea7-c41a-40d2-ae16-15f066c955b9\") " pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.359829 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqwq8\" (UniqueName: \"kubernetes.io/projected/f6103ea7-c41a-40d2-ae16-15f066c955b9-kube-api-access-zqwq8\") pod \"auto-csr-approver-29563854-qqn9z\" (UID: \"f6103ea7-c41a-40d2-ae16-15f066c955b9\") " pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.393141 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqwq8\" (UniqueName: \"kubernetes.io/projected/f6103ea7-c41a-40d2-ae16-15f066c955b9-kube-api-access-zqwq8\") pod \"auto-csr-approver-29563854-qqn9z\" (UID: \"f6103ea7-c41a-40d2-ae16-15f066c955b9\") " 
pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.452715 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.955649 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563854-qqn9z"] Mar 18 10:54:00 crc kubenswrapper[4778]: W0318 10:54:00.960289 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6103ea7_c41a_40d2_ae16_15f066c955b9.slice/crio-785ee8cb881253f21a70135f3063962bfdc3def3aaa5ad76eb532fe55d73a987 WatchSource:0}: Error finding container 785ee8cb881253f21a70135f3063962bfdc3def3aaa5ad76eb532fe55d73a987: Status 404 returned error can't find the container with id 785ee8cb881253f21a70135f3063962bfdc3def3aaa5ad76eb532fe55d73a987 Mar 18 10:54:01 crc kubenswrapper[4778]: I0318 10:54:01.899177 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" event={"ID":"f6103ea7-c41a-40d2-ae16-15f066c955b9","Type":"ContainerStarted","Data":"785ee8cb881253f21a70135f3063962bfdc3def3aaa5ad76eb532fe55d73a987"} Mar 18 10:54:02 crc kubenswrapper[4778]: I0318 10:54:02.913436 4778 generic.go:334] "Generic (PLEG): container finished" podID="f6103ea7-c41a-40d2-ae16-15f066c955b9" containerID="20d0876a2852471421fd6830f32cec9b8955b6abdc480edeb7e2a46c81a72c97" exitCode=0 Mar 18 10:54:02 crc kubenswrapper[4778]: I0318 10:54:02.913514 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" event={"ID":"f6103ea7-c41a-40d2-ae16-15f066c955b9","Type":"ContainerDied","Data":"20d0876a2852471421fd6830f32cec9b8955b6abdc480edeb7e2a46c81a72c97"} Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.306869 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.447557 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqwq8\" (UniqueName: \"kubernetes.io/projected/f6103ea7-c41a-40d2-ae16-15f066c955b9-kube-api-access-zqwq8\") pod \"f6103ea7-c41a-40d2-ae16-15f066c955b9\" (UID: \"f6103ea7-c41a-40d2-ae16-15f066c955b9\") " Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.453740 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6103ea7-c41a-40d2-ae16-15f066c955b9-kube-api-access-zqwq8" (OuterVolumeSpecName: "kube-api-access-zqwq8") pod "f6103ea7-c41a-40d2-ae16-15f066c955b9" (UID: "f6103ea7-c41a-40d2-ae16-15f066c955b9"). InnerVolumeSpecName "kube-api-access-zqwq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.549823 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqwq8\" (UniqueName: \"kubernetes.io/projected/f6103ea7-c41a-40d2-ae16-15f066c955b9-kube-api-access-zqwq8\") on node \"crc\" DevicePath \"\"" Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.951483 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" event={"ID":"f6103ea7-c41a-40d2-ae16-15f066c955b9","Type":"ContainerDied","Data":"785ee8cb881253f21a70135f3063962bfdc3def3aaa5ad76eb532fe55d73a987"} Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.951546 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.951561 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="785ee8cb881253f21a70135f3063962bfdc3def3aaa5ad76eb532fe55d73a987" Mar 18 10:54:05 crc kubenswrapper[4778]: I0318 10:54:05.414034 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563848-f28cg"] Mar 18 10:54:05 crc kubenswrapper[4778]: I0318 10:54:05.421673 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563848-f28cg"] Mar 18 10:54:06 crc kubenswrapper[4778]: I0318 10:54:06.204262 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49327474-2bad-4ebc-b955-bf9dd1268c5e" path="/var/lib/kubelet/pods/49327474-2bad-4ebc-b955-bf9dd1268c5e/volumes" Mar 18 10:54:11 crc kubenswrapper[4778]: I0318 10:54:11.187916 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:54:11 crc kubenswrapper[4778]: E0318 10:54:11.188592 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:54:23 crc kubenswrapper[4778]: I0318 10:54:23.187365 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:54:23 crc kubenswrapper[4778]: E0318 10:54:23.188717 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:54:36 crc kubenswrapper[4778]: I0318 10:54:36.186748 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:54:36 crc kubenswrapper[4778]: E0318 10:54:36.187441 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:54:37 crc kubenswrapper[4778]: I0318 10:54:37.303776 4778 scope.go:117] "RemoveContainer" containerID="f8a6db8312e875033fdd1f73d7983e85a274f3fbde6864a2faaec123c194e5c8" Mar 18 10:54:47 crc kubenswrapper[4778]: I0318 10:54:47.187963 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:54:47 crc kubenswrapper[4778]: E0318 10:54:47.189212 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:55:01 crc kubenswrapper[4778]: I0318 10:55:01.187689 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:55:01 crc kubenswrapper[4778]: 
E0318 10:55:01.188763 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:55:16 crc kubenswrapper[4778]: I0318 10:55:16.188009 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:55:16 crc kubenswrapper[4778]: E0318 10:55:16.188923 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:55:19 crc kubenswrapper[4778]: I0318 10:55:19.758113 4778 generic.go:334] "Generic (PLEG): container finished" podID="bd565818-8912-47ba-881f-f88011fa9b46" containerID="878a250da8e9fd68a2017bd74707da9dc4870b9273766f35be6449c7f483e262" exitCode=0 Mar 18 10:55:19 crc kubenswrapper[4778]: I0318 10:55:19.758164 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"bd565818-8912-47ba-881f-f88011fa9b46","Type":"ContainerDied","Data":"878a250da8e9fd68a2017bd74707da9dc4870b9273766f35be6449c7f483e262"} Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.277354 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.478940 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479099 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swmpg\" (UniqueName: \"kubernetes.io/projected/bd565818-8912-47ba-881f-f88011fa9b46-kube-api-access-swmpg\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479160 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-config\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479250 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-workdir\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479303 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-kubeconfig\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479416 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-private-key\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479524 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ca-certs\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479682 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-clouds-config\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479848 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-temporary\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479934 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-openstack-config-secret\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479991 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-public-key\") pod 
\"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.480037 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ceph\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.484260 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.485771 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.487361 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd565818-8912-47ba-881f-f88011fa9b46-kube-api-access-swmpg" (OuterVolumeSpecName: "kube-api-access-swmpg") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "kube-api-access-swmpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.489577 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ceph" (OuterVolumeSpecName: "ceph") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.513574 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-config" (OuterVolumeSpecName: "tobiko-config") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "tobiko-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.533293 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-public-key" (OuterVolumeSpecName: "tobiko-public-key") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "tobiko-public-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.538895 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-private-key" (OuterVolumeSpecName: "tobiko-private-key") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "tobiko-private-key". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.541930 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-kubeconfig" (OuterVolumeSpecName: "kubeconfig") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "kubeconfig". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.550444 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.551414 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584081 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584131 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swmpg\" (UniqueName: \"kubernetes.io/projected/bd565818-8912-47ba-881f-f88011fa9b46-kube-api-access-swmpg\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584150 4778 reconciler_common.go:293] "Volume detached for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584166 4778 reconciler_common.go:293] "Volume detached for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-kubeconfig\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584180 4778 reconciler_common.go:293] "Volume detached for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-private-key\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584194 4778 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584241 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 
10:55:21.584261 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584279 4778 reconciler_common.go:293] "Volume detached for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-public-key\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584296 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.591317 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "test-operator-clouds-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.627275 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.687114 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.687193 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.780377 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"bd565818-8912-47ba-881f-f88011fa9b46","Type":"ContainerDied","Data":"9010a2462d1e9f7fe8f1670549c3a224043e894c401167ef3df32c9556413256"} Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.780432 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9010a2462d1e9f7fe8f1670549c3a224043e894c401167ef3df32c9556413256" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.780464 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:55:22 crc kubenswrapper[4778]: I0318 10:55:22.882012 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:55:22 crc kubenswrapper[4778]: I0318 10:55:22.914737 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.187805 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.743003 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Mar 18 10:55:31 crc kubenswrapper[4778]: E0318 10:55:31.744155 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6103ea7-c41a-40d2-ae16-15f066c955b9" containerName="oc" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.744168 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6103ea7-c41a-40d2-ae16-15f066c955b9" containerName="oc" Mar 18 10:55:31 crc kubenswrapper[4778]: E0318 10:55:31.744228 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd565818-8912-47ba-881f-f88011fa9b46" containerName="tobiko-tests-tobiko" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.744236 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd565818-8912-47ba-881f-f88011fa9b46" containerName="tobiko-tests-tobiko" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.744454 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6103ea7-c41a-40d2-ae16-15f066c955b9" containerName="oc" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.744468 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd565818-8912-47ba-881f-f88011fa9b46" containerName="tobiko-tests-tobiko" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.745266 4778 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.756420 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.900015 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"3a30fa73502544b7ae8a345c4023efae179b70a66379eb3e45aa01314f418265"} Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.935032 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfch6\" (UniqueName: \"kubernetes.io/projected/4e028d5e-666c-497c-949e-97860410ad74-kube-api-access-pfch6\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.935111 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.037503 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfch6\" (UniqueName: \"kubernetes.io/projected/4e028d5e-666c-497c-949e-97860410ad74-kube-api-access-pfch6\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: 
I0318 10:55:32.037605 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.039259 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.069948 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfch6\" (UniqueName: \"kubernetes.io/projected/4e028d5e-666c-497c-949e-97860410ad74-kube-api-access-pfch6\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.077477 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.117389 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.602525 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.604989 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.915781 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" event={"ID":"4e028d5e-666c-497c-949e-97860410ad74","Type":"ContainerStarted","Data":"e74038fb5b8acc694c4975a889116cd0644a3950ac18ae2c472be442026ead89"} Mar 18 10:55:33 crc kubenswrapper[4778]: I0318 10:55:33.927512 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" event={"ID":"4e028d5e-666c-497c-949e-97860410ad74","Type":"ContainerStarted","Data":"a738e69a3512731c93e7c4933c55a03b89c1c13e6428ba10eec5561e907a7643"} Mar 18 10:55:33 crc kubenswrapper[4778]: I0318 10:55:33.952334 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" podStartSLOduration=2.514090416 podStartE2EDuration="2.952294575s" podCreationTimestamp="2026-03-18 10:55:31 +0000 UTC" firstStartedPulling="2026-03-18 10:55:32.60463772 +0000 UTC m=+6799.179382580" lastFinishedPulling="2026-03-18 10:55:33.042841879 +0000 UTC m=+6799.617586739" observedRunningTime="2026-03-18 10:55:33.946789176 +0000 UTC m=+6800.521534056" watchObservedRunningTime="2026-03-18 10:55:33.952294575 +0000 UTC m=+6800.527039505" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.657051 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ansibletest-ansibletest"] Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.663016 4778 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.669636 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.669892 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.674051 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ansibletest-ansibletest"] Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773071 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773149 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773229 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773272 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j742h\" (UniqueName: \"kubernetes.io/projected/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-kube-api-access-j742h\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773309 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773416 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773444 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ceph\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773488 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773534 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875162 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875261 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875312 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875349 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875379 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875414 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875444 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j742h\" (UniqueName: \"kubernetes.io/projected/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-kube-api-access-j742h\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875486 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875509 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875535 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ceph\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.876153 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.876570 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.877379 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.877411 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.890344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.890419 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ceph\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.891059 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.896787 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.904879 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: 
\"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.907754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j742h\" (UniqueName: \"kubernetes.io/projected/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-kube-api-access-j742h\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.917701 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.980076 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ansibletest-ansibletest" Mar 18 10:55:46 crc kubenswrapper[4778]: I0318 10:55:46.457476 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ansibletest-ansibletest"] Mar 18 10:55:47 crc kubenswrapper[4778]: I0318 10:55:47.113931 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9","Type":"ContainerStarted","Data":"ec28f4abe6951258d35c7175f6d4f29db741687b55b7cbd44e65672620fd1045"} Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.186910 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5jcs8"] Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.190086 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.244336 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jcs8"] Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.338377 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-catalog-content\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.338507 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mmnl\" (UniqueName: \"kubernetes.io/projected/2b921080-6bfb-4a4d-b453-d5e2370a7558-kube-api-access-9mmnl\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.338614 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-utilities\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.440346 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-utilities\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.440517 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-catalog-content\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.440709 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mmnl\" (UniqueName: \"kubernetes.io/projected/2b921080-6bfb-4a4d-b453-d5e2370a7558-kube-api-access-9mmnl\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.440831 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-utilities\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.440959 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-catalog-content\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.464176 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mmnl\" (UniqueName: \"kubernetes.io/projected/2b921080-6bfb-4a4d-b453-d5e2370a7558-kube-api-access-9mmnl\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.529846 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.146760 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563856-kvmq4"] Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.148872 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.151220 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.151224 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.151766 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.159362 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563856-kvmq4"] Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.322214 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94vhq\" (UniqueName: \"kubernetes.io/projected/0b69a324-153a-4262-92ea-62c8b9d5928e-kube-api-access-94vhq\") pod \"auto-csr-approver-29563856-kvmq4\" (UID: \"0b69a324-153a-4262-92ea-62c8b9d5928e\") " pod="openshift-infra/auto-csr-approver-29563856-kvmq4" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.424075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94vhq\" (UniqueName: \"kubernetes.io/projected/0b69a324-153a-4262-92ea-62c8b9d5928e-kube-api-access-94vhq\") pod \"auto-csr-approver-29563856-kvmq4\" (UID: \"0b69a324-153a-4262-92ea-62c8b9d5928e\") " pod="openshift-infra/auto-csr-approver-29563856-kvmq4" 
Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.446119 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94vhq\" (UniqueName: \"kubernetes.io/projected/0b69a324-153a-4262-92ea-62c8b9d5928e-kube-api-access-94vhq\") pod \"auto-csr-approver-29563856-kvmq4\" (UID: \"0b69a324-153a-4262-92ea-62c8b9d5928e\") " pod="openshift-infra/auto-csr-approver-29563856-kvmq4" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.475367 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" Mar 18 10:56:03 crc kubenswrapper[4778]: E0318 10:56:03.247811 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified" Mar 18 10:56:03 crc kubenswrapper[4778]: E0318 10:56:03.248225 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:56:03 crc kubenswrapper[4778]: container &Container{Name:ansibletest-ansibletest,Image:quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_ANSIBLE_EXTRA_VARS,Value:-e manual_run=false,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_FILE_EXTRA_VARS,Value:--- Mar 18 10:56:03 crc kubenswrapper[4778]: foo: bar Mar 18 10:56:03 crc kubenswrapper[4778]: ,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_GIT_BRANCH,Value:,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_GIT_REPO,Value:https://github.com/ansible/test-playbooks,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_INVENTORY,Value:localhost ansible_connection=local ansible_python_interpreter=python3 Mar 18 10:56:03 crc kubenswrapper[4778]: 
,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_PLAYBOOK,Value:./debug.yml,ValueFrom:nil,},EnvVar{Name:POD_DEBUG,Value:false,ValueFrom:nil,},EnvVar{Name:POD_INSTALL_COLLECTIONS,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{4 0} {} 4 DecimalSI},memory: {{4294967296 0} {} 4Gi BinarySI},},Requests:ResourceList{cpu: {{2 0} {} 2 DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/ansible,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/AnsibleTests/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/ansible/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/var/lib/ansible/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:compute-ssh-secret,ReadOnly:true,MountPath:/var/lib/ansible/.ssh/compute_id,SubPath:ssh-privatekey,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil
,},VolumeMount{Name:workload-ssh-secret,ReadOnly:true,MountPath:/var/lib/ansible/test_keypair.key,SubPath:ssh-privatekey,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j742h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*227,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*227,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ansibletest-ansibletest_openstack(1fb58f5e-1c8b-45e2-bf86-b81af58b66a9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 18 10:56:03 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 10:56:03 crc kubenswrapper[4778]: E0318 10:56:03.249659 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ansibletest-ansibletest\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ansibletest-ansibletest" podUID="1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" Mar 18 10:56:03 crc kubenswrapper[4778]: E0318 10:56:03.278612 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ansibletest-ansibletest\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified\\\"\"" pod="openstack/ansibletest-ansibletest" podUID="1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" Mar 18 10:56:03 crc kubenswrapper[4778]: I0318 10:56:03.750939 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jcs8"] Mar 18 10:56:03 crc kubenswrapper[4778]: I0318 10:56:03.775697 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563856-kvmq4"] Mar 18 10:56:04 crc kubenswrapper[4778]: I0318 10:56:04.284454 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" event={"ID":"0b69a324-153a-4262-92ea-62c8b9d5928e","Type":"ContainerStarted","Data":"1853e87f28a852e912f97a478758c1a4fbf38aebb6e3dd56004bc47fae74654b"} Mar 18 10:56:04 crc kubenswrapper[4778]: I0318 10:56:04.286362 4778 generic.go:334] "Generic (PLEG): container finished" podID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerID="92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef" exitCode=0 Mar 18 10:56:04 crc kubenswrapper[4778]: I0318 10:56:04.286393 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerDied","Data":"92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef"} Mar 18 10:56:04 crc kubenswrapper[4778]: I0318 10:56:04.286416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerStarted","Data":"cc671246021c63a76c31044da6da18f15794dcb492863055a35ffee65d602e4f"} Mar 18 10:56:05 crc kubenswrapper[4778]: I0318 10:56:05.296920 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" 
event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerStarted","Data":"eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899"} Mar 18 10:56:05 crc kubenswrapper[4778]: I0318 10:56:05.298886 4778 generic.go:334] "Generic (PLEG): container finished" podID="0b69a324-153a-4262-92ea-62c8b9d5928e" containerID="82c47033c6d17fb0d1f1f077c5ae48584be4ec251f8c624e7bed8591ae05dffd" exitCode=0 Mar 18 10:56:05 crc kubenswrapper[4778]: I0318 10:56:05.298925 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" event={"ID":"0b69a324-153a-4262-92ea-62c8b9d5928e","Type":"ContainerDied","Data":"82c47033c6d17fb0d1f1f077c5ae48584be4ec251f8c624e7bed8591ae05dffd"} Mar 18 10:56:06 crc kubenswrapper[4778]: I0318 10:56:06.309842 4778 generic.go:334] "Generic (PLEG): container finished" podID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerID="eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899" exitCode=0 Mar 18 10:56:06 crc kubenswrapper[4778]: I0318 10:56:06.309895 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerDied","Data":"eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899"} Mar 18 10:56:06 crc kubenswrapper[4778]: I0318 10:56:06.694954 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" Mar 18 10:56:06 crc kubenswrapper[4778]: I0318 10:56:06.767925 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94vhq\" (UniqueName: \"kubernetes.io/projected/0b69a324-153a-4262-92ea-62c8b9d5928e-kube-api-access-94vhq\") pod \"0b69a324-153a-4262-92ea-62c8b9d5928e\" (UID: \"0b69a324-153a-4262-92ea-62c8b9d5928e\") " Mar 18 10:56:06 crc kubenswrapper[4778]: I0318 10:56:06.778738 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b69a324-153a-4262-92ea-62c8b9d5928e-kube-api-access-94vhq" (OuterVolumeSpecName: "kube-api-access-94vhq") pod "0b69a324-153a-4262-92ea-62c8b9d5928e" (UID: "0b69a324-153a-4262-92ea-62c8b9d5928e"). InnerVolumeSpecName "kube-api-access-94vhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:56:06 crc kubenswrapper[4778]: I0318 10:56:06.871614 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94vhq\" (UniqueName: \"kubernetes.io/projected/0b69a324-153a-4262-92ea-62c8b9d5928e-kube-api-access-94vhq\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:07 crc kubenswrapper[4778]: I0318 10:56:07.320243 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" event={"ID":"0b69a324-153a-4262-92ea-62c8b9d5928e","Type":"ContainerDied","Data":"1853e87f28a852e912f97a478758c1a4fbf38aebb6e3dd56004bc47fae74654b"} Mar 18 10:56:07 crc kubenswrapper[4778]: I0318 10:56:07.320281 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1853e87f28a852e912f97a478758c1a4fbf38aebb6e3dd56004bc47fae74654b" Mar 18 10:56:07 crc kubenswrapper[4778]: I0318 10:56:07.320329 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" Mar 18 10:56:07 crc kubenswrapper[4778]: I0318 10:56:07.770746 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563850-jrcnr"] Mar 18 10:56:07 crc kubenswrapper[4778]: I0318 10:56:07.779419 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563850-jrcnr"] Mar 18 10:56:08 crc kubenswrapper[4778]: I0318 10:56:08.197046 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc499d31-e373-413b-8a38-1fa69f007f2f" path="/var/lib/kubelet/pods/fc499d31-e373-413b-8a38-1fa69f007f2f/volumes" Mar 18 10:56:09 crc kubenswrapper[4778]: I0318 10:56:09.337582 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerStarted","Data":"855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494"} Mar 18 10:56:09 crc kubenswrapper[4778]: I0318 10:56:09.356089 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5jcs8" podStartSLOduration=13.229774904 podStartE2EDuration="17.356067792s" podCreationTimestamp="2026-03-18 10:55:52 +0000 UTC" firstStartedPulling="2026-03-18 10:56:04.288515405 +0000 UTC m=+6830.863260255" lastFinishedPulling="2026-03-18 10:56:08.414808303 +0000 UTC m=+6834.989553143" observedRunningTime="2026-03-18 10:56:09.353691577 +0000 UTC m=+6835.928436427" watchObservedRunningTime="2026-03-18 10:56:09.356067792 +0000 UTC m=+6835.930812632" Mar 18 10:56:12 crc kubenswrapper[4778]: I0318 10:56:12.531651 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:12 crc kubenswrapper[4778]: I0318 10:56:12.531979 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:12 crc kubenswrapper[4778]: I0318 10:56:12.607217 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:13 crc kubenswrapper[4778]: I0318 10:56:13.433433 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:13 crc kubenswrapper[4778]: I0318 10:56:13.480577 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jcs8"] Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.389166 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5jcs8" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="registry-server" containerID="cri-o://855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494" gracePeriod=2 Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.879104 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.979173 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mmnl\" (UniqueName: \"kubernetes.io/projected/2b921080-6bfb-4a4d-b453-d5e2370a7558-kube-api-access-9mmnl\") pod \"2b921080-6bfb-4a4d-b453-d5e2370a7558\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.979404 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-catalog-content\") pod \"2b921080-6bfb-4a4d-b453-d5e2370a7558\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.979444 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-utilities\") pod \"2b921080-6bfb-4a4d-b453-d5e2370a7558\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.980813 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-utilities" (OuterVolumeSpecName: "utilities") pod "2b921080-6bfb-4a4d-b453-d5e2370a7558" (UID: "2b921080-6bfb-4a4d-b453-d5e2370a7558"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.989173 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b921080-6bfb-4a4d-b453-d5e2370a7558-kube-api-access-9mmnl" (OuterVolumeSpecName: "kube-api-access-9mmnl") pod "2b921080-6bfb-4a4d-b453-d5e2370a7558" (UID: "2b921080-6bfb-4a4d-b453-d5e2370a7558"). InnerVolumeSpecName "kube-api-access-9mmnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.011460 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b921080-6bfb-4a4d-b453-d5e2370a7558" (UID: "2b921080-6bfb-4a4d-b453-d5e2370a7558"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.082813 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mmnl\" (UniqueName: \"kubernetes.io/projected/2b921080-6bfb-4a4d-b453-d5e2370a7558-kube-api-access-9mmnl\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.082855 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.082870 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.403504 4778 generic.go:334] "Generic (PLEG): container finished" podID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerID="855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494" exitCode=0 Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.403577 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.403605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerDied","Data":"855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494"} Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.403954 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerDied","Data":"cc671246021c63a76c31044da6da18f15794dcb492863055a35ffee65d602e4f"} Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.403984 4778 scope.go:117] "RemoveContainer" containerID="855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.433112 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jcs8"] Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.438168 4778 scope.go:117] "RemoveContainer" containerID="eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.443756 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jcs8"] Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.463277 4778 scope.go:117] "RemoveContainer" containerID="92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.527794 4778 scope.go:117] "RemoveContainer" containerID="855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494" Mar 18 10:56:16 crc kubenswrapper[4778]: E0318 10:56:16.528294 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494\": container with ID starting with 855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494 not found: ID does not exist" containerID="855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.528328 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494"} err="failed to get container status \"855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494\": rpc error: code = NotFound desc = could not find container \"855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494\": container with ID starting with 855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494 not found: ID does not exist" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.528351 4778 scope.go:117] "RemoveContainer" containerID="eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899" Mar 18 10:56:16 crc kubenswrapper[4778]: E0318 10:56:16.528679 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899\": container with ID starting with eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899 not found: ID does not exist" containerID="eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.528703 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899"} err="failed to get container status \"eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899\": rpc error: code = NotFound desc = could not find container \"eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899\": container with ID 
starting with eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899 not found: ID does not exist" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.528717 4778 scope.go:117] "RemoveContainer" containerID="92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef" Mar 18 10:56:16 crc kubenswrapper[4778]: E0318 10:56:16.529034 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef\": container with ID starting with 92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef not found: ID does not exist" containerID="92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.529061 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef"} err="failed to get container status \"92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef\": rpc error: code = NotFound desc = could not find container \"92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef\": container with ID starting with 92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef not found: ID does not exist" Mar 18 10:56:18 crc kubenswrapper[4778]: I0318 10:56:18.197804 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" path="/var/lib/kubelet/pods/2b921080-6bfb-4a4d-b453-d5e2370a7558/volumes" Mar 18 10:56:18 crc kubenswrapper[4778]: I0318 10:56:18.427698 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9","Type":"ContainerStarted","Data":"16b1f9a1400b5530a46aaeb11db97cc9f9066213e3702e8bb6c8ab6c4b6e6715"} Mar 18 10:56:18 crc kubenswrapper[4778]: I0318 10:56:18.452344 4778 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ansibletest-ansibletest" podStartSLOduration=4.028479033 podStartE2EDuration="34.45231919s" podCreationTimestamp="2026-03-18 10:55:44 +0000 UTC" firstStartedPulling="2026-03-18 10:55:46.462265297 +0000 UTC m=+6813.037010137" lastFinishedPulling="2026-03-18 10:56:16.886105454 +0000 UTC m=+6843.460850294" observedRunningTime="2026-03-18 10:56:18.443888781 +0000 UTC m=+6845.018633631" watchObservedRunningTime="2026-03-18 10:56:18.45231919 +0000 UTC m=+6845.027064030" Mar 18 10:56:19 crc kubenswrapper[4778]: I0318 10:56:19.444082 4778 generic.go:334] "Generic (PLEG): container finished" podID="1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" containerID="16b1f9a1400b5530a46aaeb11db97cc9f9066213e3702e8bb6c8ab6c4b6e6715" exitCode=0 Mar 18 10:56:19 crc kubenswrapper[4778]: I0318 10:56:19.444187 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9","Type":"ContainerDied","Data":"16b1f9a1400b5530a46aaeb11db97cc9f9066213e3702e8bb6c8ab6c4b6e6715"} Mar 18 10:56:20 crc kubenswrapper[4778]: I0318 10:56:20.915736 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ansibletest-ansibletest" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096456 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j742h\" (UniqueName: \"kubernetes.io/projected/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-kube-api-access-j742h\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096541 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-workload-ssh-secret\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096569 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ceph\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096618 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ca-certs\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096670 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config-secret\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096707 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-compute-ssh-secret\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096805 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-temporary\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096847 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096877 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096905 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-workdir\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.098076 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod 
"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.112727 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.113715 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ceph" (OuterVolumeSpecName: "ceph") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.118924 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.125354 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-kube-api-access-j742h" (OuterVolumeSpecName: "kube-api-access-j742h") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "kube-api-access-j742h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.133914 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.141388 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-compute-ssh-secret" (OuterVolumeSpecName: "compute-ssh-secret") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "compute-ssh-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.162612 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.162992 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-workload-ssh-secret" (OuterVolumeSpecName: "workload-ssh-secret") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "workload-ssh-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.172892 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.199181 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.199530 4778 reconciler_common.go:293] "Volume detached for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-compute-ssh-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.199638 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.199778 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.199880 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.199974 4778 reconciler_common.go:293] "Volume detached 
for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.200055 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j742h\" (UniqueName: \"kubernetes.io/projected/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-kube-api-access-j742h\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.200150 4778 reconciler_common.go:293] "Volume detached for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-workload-ssh-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.200278 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.200369 4778 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.219946 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.302798 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.464094 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" 
event={"ID":"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9","Type":"ContainerDied","Data":"ec28f4abe6951258d35c7175f6d4f29db741687b55b7cbd44e65672620fd1045"} Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.464137 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec28f4abe6951258d35c7175f6d4f29db741687b55b7cbd44e65672620fd1045" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.464481 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.011218 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Mar 18 10:56:30 crc kubenswrapper[4778]: E0318 10:56:30.012409 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="extract-content" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012433 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="extract-content" Mar 18 10:56:30 crc kubenswrapper[4778]: E0318 10:56:30.012462 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="extract-utilities" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012474 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="extract-utilities" Mar 18 10:56:30 crc kubenswrapper[4778]: E0318 10:56:30.012507 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="registry-server" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012517 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="registry-server" Mar 18 10:56:30 crc kubenswrapper[4778]: E0318 
10:56:30.012539 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b69a324-153a-4262-92ea-62c8b9d5928e" containerName="oc" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012550 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b69a324-153a-4262-92ea-62c8b9d5928e" containerName="oc" Mar 18 10:56:30 crc kubenswrapper[4778]: E0318 10:56:30.012569 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" containerName="ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012582 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" containerName="ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012902 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b69a324-153a-4262-92ea-62c8b9d5928e" containerName="oc" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012933 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" containerName="ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012953 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="registry-server" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.014023 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.023215 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.203073 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mppjz\" (UniqueName: \"kubernetes.io/projected/1f57757d-6483-4e1a-9a09-e63026f73e70-kube-api-access-mppjz\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"1f57757d-6483-4e1a-9a09-e63026f73e70\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.203126 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"1f57757d-6483-4e1a-9a09-e63026f73e70\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.304722 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mppjz\" (UniqueName: \"kubernetes.io/projected/1f57757d-6483-4e1a-9a09-e63026f73e70-kube-api-access-mppjz\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"1f57757d-6483-4e1a-9a09-e63026f73e70\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.304784 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: 
\"1f57757d-6483-4e1a-9a09-e63026f73e70\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.305367 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"1f57757d-6483-4e1a-9a09-e63026f73e70\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.327994 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mppjz\" (UniqueName: \"kubernetes.io/projected/1f57757d-6483-4e1a-9a09-e63026f73e70-kube-api-access-mppjz\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"1f57757d-6483-4e1a-9a09-e63026f73e70\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.330515 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"1f57757d-6483-4e1a-9a09-e63026f73e70\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.361148 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.854772 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Mar 18 10:56:31 crc kubenswrapper[4778]: I0318 10:56:31.559507 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" event={"ID":"1f57757d-6483-4e1a-9a09-e63026f73e70","Type":"ContainerStarted","Data":"ec2d906ea27da463130b202e3b2a5fb6e590d46d02cf356b02ad075d9bf32c7c"} Mar 18 10:56:32 crc kubenswrapper[4778]: I0318 10:56:32.571667 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" event={"ID":"1f57757d-6483-4e1a-9a09-e63026f73e70","Type":"ContainerStarted","Data":"10186be56feafc053b6915c5ba5f3e6d045f382c5c6e998ae7af8cea176468ea"} Mar 18 10:56:32 crc kubenswrapper[4778]: I0318 10:56:32.595251 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" podStartSLOduration=3.059411259 podStartE2EDuration="3.595228787s" podCreationTimestamp="2026-03-18 10:56:29 +0000 UTC" firstStartedPulling="2026-03-18 10:56:30.848164955 +0000 UTC m=+6857.422909835" lastFinishedPulling="2026-03-18 10:56:31.383982503 +0000 UTC m=+6857.958727363" observedRunningTime="2026-03-18 10:56:32.592722449 +0000 UTC m=+6859.167467309" watchObservedRunningTime="2026-03-18 10:56:32.595228787 +0000 UTC m=+6859.169973637" Mar 18 10:56:37 crc kubenswrapper[4778]: I0318 10:56:37.415391 4778 scope.go:117] "RemoveContainer" containerID="f14aa13ab7520a46eb0a2d95277ca80a41dbb7bc18bd9145e42c1b01a855cabf" Mar 18 10:56:43 crc kubenswrapper[4778]: I0318 10:56:43.844350 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizontest-tests-horizontest"] Mar 18 10:56:43 
crc kubenswrapper[4778]: I0318 10:56:43.846692 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:43 crc kubenswrapper[4778]: I0318 10:56:43.848976 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizontest-tests-horizontesthorizontest-config" Mar 18 10:56:43 crc kubenswrapper[4778]: I0318 10:56:43.849601 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"test-operator-clouds-config" Mar 18 10:56:43 crc kubenswrapper[4778]: I0318 10:56:43.861744 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizontest-tests-horizontest"] Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.023694 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.023756 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.023816 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.023880 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.023915 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.023944 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8grn\" (UniqueName: \"kubernetes.io/projected/49ff1200-d42e-4022-990d-619169f357f4-kube-api-access-f8grn\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.024012 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/49ff1200-d42e-4022-990d-619169f357f4-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.024035 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: 
\"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.125837 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/49ff1200-d42e-4022-990d-619169f357f4-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.125889 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.125923 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.125957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.125986 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ca-certs\") pod 
\"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.126045 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.126092 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.126122 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8grn\" (UniqueName: \"kubernetes.io/projected/49ff1200-d42e-4022-990d-619169f357f4-kube-api-access-f8grn\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.126767 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.127476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.127735 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.128466 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/49ff1200-d42e-4022-990d-619169f357f4-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.135155 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.135608 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.147499 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.157706 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8grn\" (UniqueName: \"kubernetes.io/projected/49ff1200-d42e-4022-990d-619169f357f4-kube-api-access-f8grn\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.165985 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.174841 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: W0318 10:56:44.708828 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49ff1200_d42e_4022_990d_619169f357f4.slice/crio-4f8bd5ec162b5a7ad044f9435fa7f84993d2997926262260a373bb92e1fa8a81 WatchSource:0}: Error finding container 4f8bd5ec162b5a7ad044f9435fa7f84993d2997926262260a373bb92e1fa8a81: Status 404 returned error can't find the container with id 4f8bd5ec162b5a7ad044f9435fa7f84993d2997926262260a373bb92e1fa8a81 Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.708894 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizontest-tests-horizontest"] Mar 18 10:56:45 crc kubenswrapper[4778]: I0318 10:56:45.702439 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"49ff1200-d42e-4022-990d-619169f357f4","Type":"ContainerStarted","Data":"4f8bd5ec162b5a7ad044f9435fa7f84993d2997926262260a373bb92e1fa8a81"} Mar 18 10:57:02 crc kubenswrapper[4778]: E0318 10:57:02.708027 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizontest:current-podified" Mar 18 10:57:02 crc kubenswrapper[4778]: E0318 10:57:02.708835 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizontest-tests-horizontest,Image:quay.io/podified-antelope-centos9/openstack-horizontest:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADMIN_PASSWORD,Value:12345678,ValueFrom:nil,},EnvVar{Name:ADMIN_USERNAME,Value:admin,ValueFrom:nil,},EnvVar{Name:AUTH_URL,Value:https://keystone-public-openstack.apps-crc.testing,ValueFrom:nil,},EnvVar{Name:DASHBOARD_URL,Value:https://horizon-openstack.apps-crc.testing/,ValueFrom:nil,},EnvVar{Name:EXTRA_FLAG,Value:not pagination and test_users.py,ValueFrom:nil,},EnvVar{Name:FLAVOR_NAME,Value:m1.tiny,ValueFrom:nil,},EnvVar{Name:HORIZONTEST_DEBUG_MODE,Value:false,ValueFrom:nil,},EnvVar{Name:HORIZON_KEYS_FOLDER,Value:/etc/test_operator,ValueFrom:nil,},EnvVar{Name:HORIZON_LOGS_DIR_NAME,Value:horizon,ValueFrom:nil,},EnvVar{Name:HORIZON_REPO_BRANCH,Value:master,ValueFrom:nil,},EnvVar{Name:IMAGE_FILE,Value:/var/lib/horizontest/cirros-0.6.2-x86_64-disk.img,ValueFrom:nil,},EnvVar{Name:IMAGE_FILE_NAME,Value:cirros-0.6.2-x86_64-disk,ValueFrom:nil,},EnvVar{Name:IMAGE_URL,Value:http://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img,ValueFrom:nil,},EnvVar{Name:PASSWORD,Value:horizontest,ValueFrom:nil,},EnvVar{Name:PROJECT_NAME,Value:horizontest,ValueFrom:nil,},EnvVar{Name:PROJECT_NAME_XPATH,Value://*[@class=\"context-project\"]//ancestor::ul,ValueFrom:nil,},EnvVar{Name:REPO_URL,Value:https://review.opendev.org/openstack/horizon,ValueFrom:nil,},EnvVar{Name:USER_NAME,Value:horizontest,ValueFrom:nil,},EnvVar{Name:USE_EXTERNAL_FILES,Value:True,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{2 0} {} 2 DecimalSI},memory: {{4294967296 0} {} 4Gi BinarySI},},Requests:ResourceList{cpu: {{1 0} {} 1 DecimalSI},memory: {{2147483648 0} {} 2Gi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/horizontest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/horizontest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/var/lib/horizontest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f8grn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN 
NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42455,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42455,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizontest-tests-horizontest_openstack(49ff1200-d42e-4022-990d-619169f357f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 10:57:02 crc kubenswrapper[4778]: E0318 10:57:02.710140 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizontest-tests-horizontest\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizontest-tests-horizontest" podUID="49ff1200-d42e-4022-990d-619169f357f4" Mar 18 10:57:02 crc kubenswrapper[4778]: E0318 10:57:02.932993 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizontest-tests-horizontest\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizontest:current-podified\\\"\"" pod="openstack/horizontest-tests-horizontest" podUID="49ff1200-d42e-4022-990d-619169f357f4" Mar 18 10:57:18 crc kubenswrapper[4778]: I0318 10:57:18.117796 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"49ff1200-d42e-4022-990d-619169f357f4","Type":"ContainerStarted","Data":"9bb7e83c5b0c33f61c11e78cebbab0ce419ef90ac66a563b0301647b017512a0"} Mar 18 10:57:18 crc kubenswrapper[4778]: I0318 10:57:18.153303 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizontest-tests-horizontest" 
podStartSLOduration=3.822479441 podStartE2EDuration="36.153279737s" podCreationTimestamp="2026-03-18 10:56:42 +0000 UTC" firstStartedPulling="2026-03-18 10:56:44.711167198 +0000 UTC m=+6871.285912038" lastFinishedPulling="2026-03-18 10:57:17.041967494 +0000 UTC m=+6903.616712334" observedRunningTime="2026-03-18 10:57:18.150745768 +0000 UTC m=+6904.725490638" watchObservedRunningTime="2026-03-18 10:57:18.153279737 +0000 UTC m=+6904.728024627" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.141029 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563858-mb4zj"] Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.143060 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.145711 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.145789 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.147105 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.147167 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.158779 4778 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.162684 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563858-mb4zj"] Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.266147 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqq6j\" (UniqueName: \"kubernetes.io/projected/9e4f7f22-f4dd-4291-b26b-1a54380c3851-kube-api-access-rqq6j\") pod \"auto-csr-approver-29563858-mb4zj\" (UID: \"9e4f7f22-f4dd-4291-b26b-1a54380c3851\") " pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.368664 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqq6j\" (UniqueName: \"kubernetes.io/projected/9e4f7f22-f4dd-4291-b26b-1a54380c3851-kube-api-access-rqq6j\") pod \"auto-csr-approver-29563858-mb4zj\" (UID: \"9e4f7f22-f4dd-4291-b26b-1a54380c3851\") " pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.387986 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqq6j\" (UniqueName: \"kubernetes.io/projected/9e4f7f22-f4dd-4291-b26b-1a54380c3851-kube-api-access-rqq6j\") pod \"auto-csr-approver-29563858-mb4zj\" (UID: \"9e4f7f22-f4dd-4291-b26b-1a54380c3851\") " pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.477270 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.945829 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563858-mb4zj"] Mar 18 10:58:01 crc kubenswrapper[4778]: I0318 10:58:01.585754 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" event={"ID":"9e4f7f22-f4dd-4291-b26b-1a54380c3851","Type":"ContainerStarted","Data":"b0a8368bf55253e44d9be8d25a50cd562e2598363e672850c49a53f03d5d483c"} Mar 18 10:58:02 crc kubenswrapper[4778]: I0318 10:58:02.594321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" event={"ID":"9e4f7f22-f4dd-4291-b26b-1a54380c3851","Type":"ContainerStarted","Data":"301c31c55dd167d2c6c06a6c3d13b7a706f6ed65cd7e2a490dde753952b7fad3"} Mar 18 10:58:03 crc kubenswrapper[4778]: I0318 10:58:03.605305 4778 generic.go:334] "Generic (PLEG): container finished" podID="9e4f7f22-f4dd-4291-b26b-1a54380c3851" containerID="301c31c55dd167d2c6c06a6c3d13b7a706f6ed65cd7e2a490dde753952b7fad3" exitCode=0 Mar 18 10:58:03 crc kubenswrapper[4778]: I0318 10:58:03.605364 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" event={"ID":"9e4f7f22-f4dd-4291-b26b-1a54380c3851","Type":"ContainerDied","Data":"301c31c55dd167d2c6c06a6c3d13b7a706f6ed65cd7e2a490dde753952b7fad3"} Mar 18 10:58:04 crc kubenswrapper[4778]: I0318 10:58:04.962456 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.068014 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqq6j\" (UniqueName: \"kubernetes.io/projected/9e4f7f22-f4dd-4291-b26b-1a54380c3851-kube-api-access-rqq6j\") pod \"9e4f7f22-f4dd-4291-b26b-1a54380c3851\" (UID: \"9e4f7f22-f4dd-4291-b26b-1a54380c3851\") " Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.074499 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e4f7f22-f4dd-4291-b26b-1a54380c3851-kube-api-access-rqq6j" (OuterVolumeSpecName: "kube-api-access-rqq6j") pod "9e4f7f22-f4dd-4291-b26b-1a54380c3851" (UID: "9e4f7f22-f4dd-4291-b26b-1a54380c3851"). InnerVolumeSpecName "kube-api-access-rqq6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.171096 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqq6j\" (UniqueName: \"kubernetes.io/projected/9e4f7f22-f4dd-4291-b26b-1a54380c3851-kube-api-access-rqq6j\") on node \"crc\" DevicePath \"\"" Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.627043 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" event={"ID":"9e4f7f22-f4dd-4291-b26b-1a54380c3851","Type":"ContainerDied","Data":"b0a8368bf55253e44d9be8d25a50cd562e2598363e672850c49a53f03d5d483c"} Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.627104 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0a8368bf55253e44d9be8d25a50cd562e2598363e672850c49a53f03d5d483c" Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.627159 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.700465 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563852-h8wf4"] Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.712918 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563852-h8wf4"] Mar 18 10:58:06 crc kubenswrapper[4778]: I0318 10:58:06.198220 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902826c1-406d-4d16-8655-4a85ff4a3205" path="/var/lib/kubelet/pods/902826c1-406d-4d16-8655-4a85ff4a3205/volumes" Mar 18 10:58:30 crc kubenswrapper[4778]: I0318 10:58:30.147946 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:58:30 crc kubenswrapper[4778]: I0318 10:58:30.149797 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:58:37 crc kubenswrapper[4778]: I0318 10:58:37.565825 4778 scope.go:117] "RemoveContainer" containerID="96707a6d26a4e59376b5ccb6d995399c8158c2cfc047ca02c91e8c3ceb00d6d6" Mar 18 10:59:00 crc kubenswrapper[4778]: I0318 10:59:00.147564 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:59:00 crc kubenswrapper[4778]: 
I0318 10:59:00.148518 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:59:00 crc kubenswrapper[4778]: I0318 10:59:00.148583 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:59:00 crc kubenswrapper[4778]: I0318 10:59:00.149724 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a30fa73502544b7ae8a345c4023efae179b70a66379eb3e45aa01314f418265"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:59:00 crc kubenswrapper[4778]: I0318 10:59:00.149823 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://3a30fa73502544b7ae8a345c4023efae179b70a66379eb3e45aa01314f418265" gracePeriod=600 Mar 18 10:59:01 crc kubenswrapper[4778]: I0318 10:59:01.252477 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="3a30fa73502544b7ae8a345c4023efae179b70a66379eb3e45aa01314f418265" exitCode=0 Mar 18 10:59:01 crc kubenswrapper[4778]: I0318 10:59:01.252517 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"3a30fa73502544b7ae8a345c4023efae179b70a66379eb3e45aa01314f418265"} Mar 18 10:59:01 crc 
kubenswrapper[4778]: I0318 10:59:01.253297 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"} Mar 18 10:59:01 crc kubenswrapper[4778]: I0318 10:59:01.253342 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:59:13 crc kubenswrapper[4778]: I0318 10:59:13.387117 4778 generic.go:334] "Generic (PLEG): container finished" podID="49ff1200-d42e-4022-990d-619169f357f4" containerID="9bb7e83c5b0c33f61c11e78cebbab0ce419ef90ac66a563b0301647b017512a0" exitCode=0 Mar 18 10:59:13 crc kubenswrapper[4778]: I0318 10:59:13.387247 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"49ff1200-d42e-4022-990d-619169f357f4","Type":"ContainerDied","Data":"9bb7e83c5b0c33f61c11e78cebbab0ce419ef90ac66a563b0301647b017512a0"} Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.802775 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizontest-tests-horizontest" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.899939 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8grn\" (UniqueName: \"kubernetes.io/projected/49ff1200-d42e-4022-990d-619169f357f4-kube-api-access-f8grn\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900073 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/49ff1200-d42e-4022-990d-619169f357f4-test-operator-clouds-config\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900230 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ceph\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900281 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-temporary\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900633 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900679 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ca-certs\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900783 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-openstack-config-secret\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900877 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-workdir\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.901598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.907610 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.908745 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ceph" (OuterVolumeSpecName: "ceph") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.910444 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ff1200-d42e-4022-990d-619169f357f4-kube-api-access-f8grn" (OuterVolumeSpecName: "kube-api-access-f8grn") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "kube-api-access-f8grn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.945051 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.961328 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.995764 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ff1200-d42e-4022-990d-619169f357f4-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "test-operator-clouds-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.006475 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.006512 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8grn\" (UniqueName: \"kubernetes.io/projected/49ff1200-d42e-4022-990d-619169f357f4-kube-api-access-f8grn\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.006525 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/49ff1200-d42e-4022-990d-619169f357f4-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.006535 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.006545 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 
10:59:15.006579 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.006592 4778 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.033396 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.109260 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.155469 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.211350 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.415516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"49ff1200-d42e-4022-990d-619169f357f4","Type":"ContainerDied","Data":"4f8bd5ec162b5a7ad044f9435fa7f84993d2997926262260a373bb92e1fa8a81"} Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.415747 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f8bd5ec162b5a7ad044f9435fa7f84993d2997926262260a373bb92e1fa8a81" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.415897 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizontest-tests-horizontest" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.789337 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h6pkp"] Mar 18 10:59:15 crc kubenswrapper[4778]: E0318 10:59:15.789932 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ff1200-d42e-4022-990d-619169f357f4" containerName="horizontest-tests-horizontest" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.789959 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ff1200-d42e-4022-990d-619169f357f4" containerName="horizontest-tests-horizontest" Mar 18 10:59:15 crc kubenswrapper[4778]: E0318 10:59:15.789992 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e4f7f22-f4dd-4291-b26b-1a54380c3851" containerName="oc" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.790002 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e4f7f22-f4dd-4291-b26b-1a54380c3851" containerName="oc" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.790351 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ff1200-d42e-4022-990d-619169f357f4" containerName="horizontest-tests-horizontest" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.790373 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e4f7f22-f4dd-4291-b26b-1a54380c3851" containerName="oc" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.792669 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.810986 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h6pkp"] Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.932305 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-catalog-content\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.932536 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m8r2\" (UniqueName: \"kubernetes.io/projected/752de958-6cfc-4ceb-84c4-006b0719f0a5-kube-api-access-7m8r2\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.932651 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-utilities\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.035076 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m8r2\" (UniqueName: \"kubernetes.io/projected/752de958-6cfc-4ceb-84c4-006b0719f0a5-kube-api-access-7m8r2\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.035142 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-utilities\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.035264 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-catalog-content\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.035762 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-catalog-content\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.035932 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-utilities\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.054063 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m8r2\" (UniqueName: \"kubernetes.io/projected/752de958-6cfc-4ceb-84c4-006b0719f0a5-kube-api-access-7m8r2\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.117044 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.585604 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h6pkp"] Mar 18 10:59:17 crc kubenswrapper[4778]: I0318 10:59:17.434171 4778 generic.go:334] "Generic (PLEG): container finished" podID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerID="8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5" exitCode=0 Mar 18 10:59:17 crc kubenswrapper[4778]: I0318 10:59:17.434229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6pkp" event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerDied","Data":"8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5"} Mar 18 10:59:17 crc kubenswrapper[4778]: I0318 10:59:17.434538 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6pkp" event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerStarted","Data":"1a5621a90cc4698a3bf1daf5ec403ed97712aae84421056cd21a797fac8ef3b4"} Mar 18 10:59:19 crc kubenswrapper[4778]: I0318 10:59:19.457929 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6pkp" event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerStarted","Data":"f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e"} Mar 18 10:59:21 crc kubenswrapper[4778]: I0318 10:59:21.479113 4778 generic.go:334] "Generic (PLEG): container finished" podID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerID="f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e" exitCode=0 Mar 18 10:59:21 crc kubenswrapper[4778]: I0318 10:59:21.479175 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6pkp" 
event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerDied","Data":"f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e"} Mar 18 10:59:22 crc kubenswrapper[4778]: I0318 10:59:22.493108 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6pkp" event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerStarted","Data":"9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd"} Mar 18 10:59:22 crc kubenswrapper[4778]: I0318 10:59:22.520679 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h6pkp" podStartSLOduration=2.755921678 podStartE2EDuration="7.520658249s" podCreationTimestamp="2026-03-18 10:59:15 +0000 UTC" firstStartedPulling="2026-03-18 10:59:17.436051339 +0000 UTC m=+7024.010796179" lastFinishedPulling="2026-03-18 10:59:22.2007879 +0000 UTC m=+7028.775532750" observedRunningTime="2026-03-18 10:59:22.512519238 +0000 UTC m=+7029.087264088" watchObservedRunningTime="2026-03-18 10:59:22.520658249 +0000 UTC m=+7029.095403089" Mar 18 10:59:25 crc kubenswrapper[4778]: I0318 10:59:25.864857 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Mar 18 10:59:25 crc kubenswrapper[4778]: I0318 10:59:25.866649 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:25 crc kubenswrapper[4778]: I0318 10:59:25.888266 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Mar 18 10:59:25 crc kubenswrapper[4778]: I0318 10:59:25.936743 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:25 crc kubenswrapper[4778]: I0318 10:59:25.936827 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9r69\" (UniqueName: \"kubernetes.io/projected/3db5e33d-384f-4df3-bfb8-ba279b83f7e4-kube-api-access-w9r69\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.038527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9r69\" (UniqueName: \"kubernetes.io/projected/3db5e33d-384f-4df3-bfb8-ba279b83f7e4-kube-api-access-w9r69\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.038726 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.039164 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.057433 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9r69\" (UniqueName: \"kubernetes.io/projected/3db5e33d-384f-4df3-bfb8-ba279b83f7e4-kube-api-access-w9r69\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.069467 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.121448 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.121496 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 
10:59:26.211545 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: E0318 10:59:26.212004 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.681260 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Mar 18 10:59:26 crc kubenswrapper[4778]: W0318 10:59:26.688733 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3db5e33d_384f_4df3_bfb8_ba279b83f7e4.slice/crio-681981faadfade7153f4a024a87d8045784570ab4504a46dfdbd1ca36458b970 WatchSource:0}: Error finding container 681981faadfade7153f4a024a87d8045784570ab4504a46dfdbd1ca36458b970: Status 404 returned error can't find the container with id 681981faadfade7153f4a024a87d8045784570ab4504a46dfdbd1ca36458b970 Mar 18 10:59:26 crc kubenswrapper[4778]: E0318 10:59:26.690245 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 10:59:27 crc kubenswrapper[4778]: E0318 10:59:27.121501 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 10:59:27 crc kubenswrapper[4778]: I0318 10:59:27.171946 4778 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-h6pkp" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="registry-server" probeResult="failure" output=< Mar 18 10:59:27 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:59:27 crc kubenswrapper[4778]: > Mar 18 10:59:27 crc kubenswrapper[4778]: I0318 10:59:27.561923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" event={"ID":"3db5e33d-384f-4df3-bfb8-ba279b83f7e4","Type":"ContainerStarted","Data":"937f8a9872fea869494a13c13092a7d2831b1396fbc8d1a4968641e7cbe150fc"} Mar 18 10:59:27 crc kubenswrapper[4778]: I0318 10:59:27.562338 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" event={"ID":"3db5e33d-384f-4df3-bfb8-ba279b83f7e4","Type":"ContainerStarted","Data":"681981faadfade7153f4a024a87d8045784570ab4504a46dfdbd1ca36458b970"} Mar 18 10:59:27 crc kubenswrapper[4778]: E0318 10:59:27.562582 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 10:59:27 crc kubenswrapper[4778]: I0318 10:59:27.577967 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" podStartSLOduration=2.148187717 podStartE2EDuration="2.577947737s" podCreationTimestamp="2026-03-18 10:59:25 +0000 UTC" firstStartedPulling="2026-03-18 10:59:26.691572928 +0000 UTC m=+7033.266317808" lastFinishedPulling="2026-03-18 10:59:27.121332968 +0000 UTC m=+7033.696077828" observedRunningTime="2026-03-18 10:59:27.57436956 +0000 UTC m=+7034.149114400" watchObservedRunningTime="2026-03-18 10:59:27.577947737 +0000 UTC m=+7034.152692597" Mar 18 10:59:28 crc 
kubenswrapper[4778]: E0318 10:59:28.570111 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 10:59:36 crc kubenswrapper[4778]: I0318 10:59:36.165977 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:36 crc kubenswrapper[4778]: I0318 10:59:36.218922 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:36 crc kubenswrapper[4778]: I0318 10:59:36.412257 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h6pkp"] Mar 18 10:59:37 crc kubenswrapper[4778]: I0318 10:59:37.653486 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h6pkp" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="registry-server" containerID="cri-o://9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd" gracePeriod=2 Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.186167 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.314728 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-utilities\") pod \"752de958-6cfc-4ceb-84c4-006b0719f0a5\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.314984 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-catalog-content\") pod \"752de958-6cfc-4ceb-84c4-006b0719f0a5\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.315032 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m8r2\" (UniqueName: \"kubernetes.io/projected/752de958-6cfc-4ceb-84c4-006b0719f0a5-kube-api-access-7m8r2\") pod \"752de958-6cfc-4ceb-84c4-006b0719f0a5\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.316575 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-utilities" (OuterVolumeSpecName: "utilities") pod "752de958-6cfc-4ceb-84c4-006b0719f0a5" (UID: "752de958-6cfc-4ceb-84c4-006b0719f0a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.320211 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/752de958-6cfc-4ceb-84c4-006b0719f0a5-kube-api-access-7m8r2" (OuterVolumeSpecName: "kube-api-access-7m8r2") pod "752de958-6cfc-4ceb-84c4-006b0719f0a5" (UID: "752de958-6cfc-4ceb-84c4-006b0719f0a5"). InnerVolumeSpecName "kube-api-access-7m8r2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.417382 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.417744 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m8r2\" (UniqueName: \"kubernetes.io/projected/752de958-6cfc-4ceb-84c4-006b0719f0a5-kube-api-access-7m8r2\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.456573 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "752de958-6cfc-4ceb-84c4-006b0719f0a5" (UID: "752de958-6cfc-4ceb-84c4-006b0719f0a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.519360 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.664883 4778 generic.go:334] "Generic (PLEG): container finished" podID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerID="9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd" exitCode=0 Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.664931 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6pkp" event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerDied","Data":"9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd"} Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.664962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-h6pkp" event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerDied","Data":"1a5621a90cc4698a3bf1daf5ec403ed97712aae84421056cd21a797fac8ef3b4"} Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.664981 4778 scope.go:117] "RemoveContainer" containerID="9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.665125 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.707180 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h6pkp"] Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.710030 4778 scope.go:117] "RemoveContainer" containerID="f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.714800 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h6pkp"] Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.736390 4778 scope.go:117] "RemoveContainer" containerID="8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.809259 4778 scope.go:117] "RemoveContainer" containerID="9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd" Mar 18 10:59:38 crc kubenswrapper[4778]: E0318 10:59:38.809903 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd\": container with ID starting with 9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd not found: ID does not exist" containerID="9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.810136 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd"} err="failed to get container status \"9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd\": rpc error: code = NotFound desc = could not find container \"9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd\": container with ID starting with 9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd not found: ID does not exist" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.810178 4778 scope.go:117] "RemoveContainer" containerID="f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e" Mar 18 10:59:38 crc kubenswrapper[4778]: E0318 10:59:38.810790 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e\": container with ID starting with f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e not found: ID does not exist" containerID="f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.810816 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e"} err="failed to get container status \"f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e\": rpc error: code = NotFound desc = could not find container \"f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e\": container with ID starting with f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e not found: ID does not exist" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.810831 4778 scope.go:117] "RemoveContainer" containerID="8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5" Mar 18 10:59:38 crc kubenswrapper[4778]: E0318 
10:59:38.811274 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5\": container with ID starting with 8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5 not found: ID does not exist" containerID="8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.811320 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5"} err="failed to get container status \"8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5\": rpc error: code = NotFound desc = could not find container \"8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5\": container with ID starting with 8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5 not found: ID does not exist" Mar 18 10:59:40 crc kubenswrapper[4778]: I0318 10:59:40.209721 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" path="/var/lib/kubelet/pods/752de958-6cfc-4ceb-84c4-006b0719f0a5/volumes" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.723972 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n6w9k/must-gather-5mjwn"] Mar 18 10:59:51 crc kubenswrapper[4778]: E0318 10:59:51.724997 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="extract-content" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.725009 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="extract-content" Mar 18 10:59:51 crc kubenswrapper[4778]: E0318 10:59:51.725028 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" 
containerName="extract-utilities" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.725035 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="extract-utilities" Mar 18 10:59:51 crc kubenswrapper[4778]: E0318 10:59:51.725048 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="registry-server" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.725054 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="registry-server" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.725317 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="registry-server" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.726501 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.728726 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n6w9k"/"openshift-service-ca.crt" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.729047 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n6w9k"/"default-dockercfg-88grs" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.732401 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n6w9k/must-gather-5mjwn"] Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.736466 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n6w9k"/"kube-root-ca.crt" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.840939 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zvb6\" (UniqueName: 
\"kubernetes.io/projected/2c2e2094-7c48-4653-8b53-95483d470344-kube-api-access-6zvb6\") pod \"must-gather-5mjwn\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.841020 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c2e2094-7c48-4653-8b53-95483d470344-must-gather-output\") pod \"must-gather-5mjwn\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.943079 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zvb6\" (UniqueName: \"kubernetes.io/projected/2c2e2094-7c48-4653-8b53-95483d470344-kube-api-access-6zvb6\") pod \"must-gather-5mjwn\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.943513 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c2e2094-7c48-4653-8b53-95483d470344-must-gather-output\") pod \"must-gather-5mjwn\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.944416 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c2e2094-7c48-4653-8b53-95483d470344-must-gather-output\") pod \"must-gather-5mjwn\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.973489 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zvb6\" (UniqueName: 
\"kubernetes.io/projected/2c2e2094-7c48-4653-8b53-95483d470344-kube-api-access-6zvb6\") pod \"must-gather-5mjwn\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:52 crc kubenswrapper[4778]: I0318 10:59:52.046005 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:52 crc kubenswrapper[4778]: I0318 10:59:52.598835 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n6w9k/must-gather-5mjwn"] Mar 18 10:59:52 crc kubenswrapper[4778]: I0318 10:59:52.853531 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" event={"ID":"2c2e2094-7c48-4653-8b53-95483d470344","Type":"ContainerStarted","Data":"11e1f6a8bb07bab9c0aacab6604201e86370f0dcd8ed040feff79260d58c73a3"} Mar 18 10:59:59 crc kubenswrapper[4778]: I0318 10:59:59.909350 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" event={"ID":"2c2e2094-7c48-4653-8b53-95483d470344","Type":"ContainerStarted","Data":"e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec"} Mar 18 10:59:59 crc kubenswrapper[4778]: I0318 10:59:59.910008 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" event={"ID":"2c2e2094-7c48-4653-8b53-95483d470344","Type":"ContainerStarted","Data":"8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa"} Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.146361 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" podStartSLOduration=2.85881927 podStartE2EDuration="9.14634151s" podCreationTimestamp="2026-03-18 10:59:51 +0000 UTC" firstStartedPulling="2026-03-18 10:59:52.594251759 +0000 UTC m=+7059.168996599" lastFinishedPulling="2026-03-18 10:59:58.881773999 +0000 UTC 
m=+7065.456518839" observedRunningTime="2026-03-18 10:59:59.926521926 +0000 UTC m=+7066.501266766" watchObservedRunningTime="2026-03-18 11:00:00.14634151 +0000 UTC m=+7066.721086350" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.150732 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563860-9p79f"] Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.152682 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.155806 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.155837 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.156872 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.160978 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr"] Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.162683 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.164389 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.165440 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.181034 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr"] Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.216333 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a112dd3e-72a0-48ea-a69c-448090520236-config-volume\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.216405 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a112dd3e-72a0-48ea-a69c-448090520236-secret-volume\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.216506 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t76pg\" (UniqueName: \"kubernetes.io/projected/a112dd3e-72a0-48ea-a69c-448090520236-kube-api-access-t76pg\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.216547 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c2v2\" (UniqueName: \"kubernetes.io/projected/9bbe37de-66b2-4c42-a72f-92155eb2edb9-kube-api-access-9c2v2\") pod \"auto-csr-approver-29563860-9p79f\" (UID: \"9bbe37de-66b2-4c42-a72f-92155eb2edb9\") " pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.266761 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563860-9p79f"] Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.317978 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a112dd3e-72a0-48ea-a69c-448090520236-secret-volume\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.318135 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t76pg\" (UniqueName: \"kubernetes.io/projected/a112dd3e-72a0-48ea-a69c-448090520236-kube-api-access-t76pg\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.318209 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c2v2\" (UniqueName: \"kubernetes.io/projected/9bbe37de-66b2-4c42-a72f-92155eb2edb9-kube-api-access-9c2v2\") pod \"auto-csr-approver-29563860-9p79f\" (UID: \"9bbe37de-66b2-4c42-a72f-92155eb2edb9\") " pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:00 crc kubenswrapper[4778]: 
I0318 11:00:00.318330 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a112dd3e-72a0-48ea-a69c-448090520236-config-volume\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.319476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a112dd3e-72a0-48ea-a69c-448090520236-config-volume\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.330909 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a112dd3e-72a0-48ea-a69c-448090520236-secret-volume\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.334038 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t76pg\" (UniqueName: \"kubernetes.io/projected/a112dd3e-72a0-48ea-a69c-448090520236-kube-api-access-t76pg\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.334992 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c2v2\" (UniqueName: \"kubernetes.io/projected/9bbe37de-66b2-4c42-a72f-92155eb2edb9-kube-api-access-9c2v2\") pod \"auto-csr-approver-29563860-9p79f\" (UID: \"9bbe37de-66b2-4c42-a72f-92155eb2edb9\") " 
pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.480253 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.508593 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.965124 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563860-9p79f"] Mar 18 11:00:01 crc kubenswrapper[4778]: I0318 11:00:01.053384 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr"] Mar 18 11:00:01 crc kubenswrapper[4778]: W0318 11:00:01.053634 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda112dd3e_72a0_48ea_a69c_448090520236.slice/crio-a8cfd153fe3e22eb43dc06184206855dd94fe8c85cc237bd5028b0ea5da00d1b WatchSource:0}: Error finding container a8cfd153fe3e22eb43dc06184206855dd94fe8c85cc237bd5028b0ea5da00d1b: Status 404 returned error can't find the container with id a8cfd153fe3e22eb43dc06184206855dd94fe8c85cc237bd5028b0ea5da00d1b Mar 18 11:00:01 crc kubenswrapper[4778]: I0318 11:00:01.932367 4778 generic.go:334] "Generic (PLEG): container finished" podID="a112dd3e-72a0-48ea-a69c-448090520236" containerID="8bd79fe1dda5f124cb0c95449837cc849410b83946a1ead7147f36344ba1810d" exitCode=0 Mar 18 11:00:01 crc kubenswrapper[4778]: I0318 11:00:01.932487 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" event={"ID":"a112dd3e-72a0-48ea-a69c-448090520236","Type":"ContainerDied","Data":"8bd79fe1dda5f124cb0c95449837cc849410b83946a1ead7147f36344ba1810d"} Mar 18 11:00:01 crc 
kubenswrapper[4778]: I0318 11:00:01.932818 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" event={"ID":"a112dd3e-72a0-48ea-a69c-448090520236","Type":"ContainerStarted","Data":"a8cfd153fe3e22eb43dc06184206855dd94fe8c85cc237bd5028b0ea5da00d1b"} Mar 18 11:00:01 crc kubenswrapper[4778]: I0318 11:00:01.934018 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563860-9p79f" event={"ID":"9bbe37de-66b2-4c42-a72f-92155eb2edb9","Type":"ContainerStarted","Data":"8afcfa7ffbe12d4a317cb693519716b9c3977186844e3596a5ac00ca0f3c4061"} Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.364569 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.381039 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t76pg\" (UniqueName: \"kubernetes.io/projected/a112dd3e-72a0-48ea-a69c-448090520236-kube-api-access-t76pg\") pod \"a112dd3e-72a0-48ea-a69c-448090520236\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.381437 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a112dd3e-72a0-48ea-a69c-448090520236-config-volume\") pod \"a112dd3e-72a0-48ea-a69c-448090520236\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.381501 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a112dd3e-72a0-48ea-a69c-448090520236-secret-volume\") pod \"a112dd3e-72a0-48ea-a69c-448090520236\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 
11:00:03.382728 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a112dd3e-72a0-48ea-a69c-448090520236-config-volume" (OuterVolumeSpecName: "config-volume") pod "a112dd3e-72a0-48ea-a69c-448090520236" (UID: "a112dd3e-72a0-48ea-a69c-448090520236"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.390110 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a112dd3e-72a0-48ea-a69c-448090520236-kube-api-access-t76pg" (OuterVolumeSpecName: "kube-api-access-t76pg") pod "a112dd3e-72a0-48ea-a69c-448090520236" (UID: "a112dd3e-72a0-48ea-a69c-448090520236"). InnerVolumeSpecName "kube-api-access-t76pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.391273 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a112dd3e-72a0-48ea-a69c-448090520236-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a112dd3e-72a0-48ea-a69c-448090520236" (UID: "a112dd3e-72a0-48ea-a69c-448090520236"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.487922 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t76pg\" (UniqueName: \"kubernetes.io/projected/a112dd3e-72a0-48ea-a69c-448090520236-kube-api-access-t76pg\") on node \"crc\" DevicePath \"\"" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.487974 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a112dd3e-72a0-48ea-a69c-448090520236-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.487992 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a112dd3e-72a0-48ea-a69c-448090520236-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.954933 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" event={"ID":"a112dd3e-72a0-48ea-a69c-448090520236","Type":"ContainerDied","Data":"a8cfd153fe3e22eb43dc06184206855dd94fe8c85cc237bd5028b0ea5da00d1b"} Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.955239 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8cfd153fe3e22eb43dc06184206855dd94fe8c85cc237bd5028b0ea5da00d1b" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.955016 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:04 crc kubenswrapper[4778]: I0318 11:00:04.432237 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf"] Mar 18 11:00:04 crc kubenswrapper[4778]: I0318 11:00:04.441688 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf"] Mar 18 11:00:04 crc kubenswrapper[4778]: I0318 11:00:04.871127 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-cmwv6"] Mar 18 11:00:04 crc kubenswrapper[4778]: E0318 11:00:04.871615 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a112dd3e-72a0-48ea-a69c-448090520236" containerName="collect-profiles" Mar 18 11:00:04 crc kubenswrapper[4778]: I0318 11:00:04.871631 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a112dd3e-72a0-48ea-a69c-448090520236" containerName="collect-profiles" Mar 18 11:00:04 crc kubenswrapper[4778]: I0318 11:00:04.871800 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a112dd3e-72a0-48ea-a69c-448090520236" containerName="collect-profiles" Mar 18 11:00:04 crc kubenswrapper[4778]: I0318 11:00:04.872411 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.021748 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-host\") pod \"crc-debug-cmwv6\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.021810 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbgjp\" (UniqueName: \"kubernetes.io/projected/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-kube-api-access-dbgjp\") pod \"crc-debug-cmwv6\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.123740 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-host\") pod \"crc-debug-cmwv6\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.124085 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbgjp\" (UniqueName: \"kubernetes.io/projected/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-kube-api-access-dbgjp\") pod \"crc-debug-cmwv6\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.124574 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-host\") pod \"crc-debug-cmwv6\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc 
kubenswrapper[4778]: I0318 11:00:05.144852 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbgjp\" (UniqueName: \"kubernetes.io/projected/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-kube-api-access-dbgjp\") pod \"crc-debug-cmwv6\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.193403 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.972377 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" event={"ID":"c15df1c1-2c25-4e82-9933-ada0bd8d6d73","Type":"ContainerStarted","Data":"7b962272cb27c5f171848527e65fc41e0d889940197edbf9741702f400418883"} Mar 18 11:00:06 crc kubenswrapper[4778]: I0318 11:00:06.203328 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a12e64d-d433-4f42-8aa6-cd1de264b346" path="/var/lib/kubelet/pods/1a12e64d-d433-4f42-8aa6-cd1de264b346/volumes" Mar 18 11:00:18 crc kubenswrapper[4778]: I0318 11:00:18.112228 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" event={"ID":"c15df1c1-2c25-4e82-9933-ada0bd8d6d73","Type":"ContainerStarted","Data":"3fd6d1f0823c046b310eb8beb463813250215ced92cb6e72ad8093250484ff70"} Mar 18 11:00:18 crc kubenswrapper[4778]: I0318 11:00:18.142482 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" podStartSLOduration=2.478719784 podStartE2EDuration="14.142464285s" podCreationTimestamp="2026-03-18 11:00:04 +0000 UTC" firstStartedPulling="2026-03-18 11:00:05.267958393 +0000 UTC m=+7071.842703233" lastFinishedPulling="2026-03-18 11:00:16.931702894 +0000 UTC m=+7083.506447734" observedRunningTime="2026-03-18 11:00:18.134036016 +0000 UTC m=+7084.708780886" 
watchObservedRunningTime="2026-03-18 11:00:18.142464285 +0000 UTC m=+7084.717209125" Mar 18 11:00:19 crc kubenswrapper[4778]: I0318 11:00:19.122116 4778 generic.go:334] "Generic (PLEG): container finished" podID="9bbe37de-66b2-4c42-a72f-92155eb2edb9" containerID="1bb8821bd2c18d4bb5e7f9c4c0784d606dc27180e5e74bcaf381cd0d404e43fd" exitCode=0 Mar 18 11:00:19 crc kubenswrapper[4778]: I0318 11:00:19.122165 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563860-9p79f" event={"ID":"9bbe37de-66b2-4c42-a72f-92155eb2edb9","Type":"ContainerDied","Data":"1bb8821bd2c18d4bb5e7f9c4c0784d606dc27180e5e74bcaf381cd0d404e43fd"} Mar 18 11:00:20 crc kubenswrapper[4778]: I0318 11:00:20.455010 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:20 crc kubenswrapper[4778]: I0318 11:00:20.643473 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c2v2\" (UniqueName: \"kubernetes.io/projected/9bbe37de-66b2-4c42-a72f-92155eb2edb9-kube-api-access-9c2v2\") pod \"9bbe37de-66b2-4c42-a72f-92155eb2edb9\" (UID: \"9bbe37de-66b2-4c42-a72f-92155eb2edb9\") " Mar 18 11:00:20 crc kubenswrapper[4778]: I0318 11:00:20.653112 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bbe37de-66b2-4c42-a72f-92155eb2edb9-kube-api-access-9c2v2" (OuterVolumeSpecName: "kube-api-access-9c2v2") pod "9bbe37de-66b2-4c42-a72f-92155eb2edb9" (UID: "9bbe37de-66b2-4c42-a72f-92155eb2edb9"). InnerVolumeSpecName "kube-api-access-9c2v2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:00:20 crc kubenswrapper[4778]: I0318 11:00:20.746704 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c2v2\" (UniqueName: \"kubernetes.io/projected/9bbe37de-66b2-4c42-a72f-92155eb2edb9-kube-api-access-9c2v2\") on node \"crc\" DevicePath \"\"" Mar 18 11:00:21 crc kubenswrapper[4778]: I0318 11:00:21.140658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563860-9p79f" event={"ID":"9bbe37de-66b2-4c42-a72f-92155eb2edb9","Type":"ContainerDied","Data":"8afcfa7ffbe12d4a317cb693519716b9c3977186844e3596a5ac00ca0f3c4061"} Mar 18 11:00:21 crc kubenswrapper[4778]: I0318 11:00:21.140934 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8afcfa7ffbe12d4a317cb693519716b9c3977186844e3596a5ac00ca0f3c4061" Mar 18 11:00:21 crc kubenswrapper[4778]: I0318 11:00:21.140716 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:21 crc kubenswrapper[4778]: I0318 11:00:21.528136 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563854-qqn9z"] Mar 18 11:00:21 crc kubenswrapper[4778]: I0318 11:00:21.537106 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563854-qqn9z"] Mar 18 11:00:22 crc kubenswrapper[4778]: I0318 11:00:22.197390 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6103ea7-c41a-40d2-ae16-15f066c955b9" path="/var/lib/kubelet/pods/f6103ea7-c41a-40d2-ae16-15f066c955b9/volumes" Mar 18 11:00:37 crc kubenswrapper[4778]: I0318 11:00:37.733562 4778 scope.go:117] "RemoveContainer" containerID="20d0876a2852471421fd6830f32cec9b8955b6abdc480edeb7e2a46c81a72c97" Mar 18 11:00:37 crc kubenswrapper[4778]: I0318 11:00:37.782820 4778 scope.go:117] "RemoveContainer" 
containerID="4c308b5ed19066acb80d31a8263c3f25bb04c0935256cdfa497ae0b275b40ad3" Mar 18 11:00:42 crc kubenswrapper[4778]: E0318 11:00:42.187774 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.147237 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.148928 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.154217 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29563861-czdsg"] Mar 18 11:01:00 crc kubenswrapper[4778]: E0318 11:01:00.154654 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbe37de-66b2-4c42-a72f-92155eb2edb9" containerName="oc" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.154676 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbe37de-66b2-4c42-a72f-92155eb2edb9" containerName="oc" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.154927 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bbe37de-66b2-4c42-a72f-92155eb2edb9" containerName="oc" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.155687 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.163999 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563861-czdsg"] Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.337479 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbgmk\" (UniqueName: \"kubernetes.io/projected/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-kube-api-access-sbgmk\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.337581 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-combined-ca-bundle\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.337707 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-fernet-keys\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.337735 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-config-data\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.439066 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-fernet-keys\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.439118 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-config-data\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.439192 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbgmk\" (UniqueName: \"kubernetes.io/projected/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-kube-api-access-sbgmk\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.439243 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-combined-ca-bundle\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.444834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-fernet-keys\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.445190 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-combined-ca-bundle\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.446649 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-config-data\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.464531 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbgmk\" (UniqueName: \"kubernetes.io/projected/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-kube-api-access-sbgmk\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.475214 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.982397 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563861-czdsg"] Mar 18 11:01:01 crc kubenswrapper[4778]: I0318 11:01:01.546864 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563861-czdsg" event={"ID":"d34b9add-0199-4bf1-81f8-fa4c2a9138e7","Type":"ContainerStarted","Data":"f984a2631cb84423ec49d0a74a80858b922aaf45ecd4a6618c404c384cca758e"} Mar 18 11:01:01 crc kubenswrapper[4778]: I0318 11:01:01.547215 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563861-czdsg" event={"ID":"d34b9add-0199-4bf1-81f8-fa4c2a9138e7","Type":"ContainerStarted","Data":"2316c1f8492c44ecfcb6c8739c1c26728d3712b4d7998cdb3a69536131c4537f"} Mar 18 11:01:01 crc kubenswrapper[4778]: I0318 11:01:01.563288 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29563861-czdsg" podStartSLOduration=1.563268566 podStartE2EDuration="1.563268566s" podCreationTimestamp="2026-03-18 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:01:01.558917967 +0000 UTC m=+7128.133662837" watchObservedRunningTime="2026-03-18 11:01:01.563268566 +0000 UTC m=+7128.138013406" Mar 18 11:01:03 crc kubenswrapper[4778]: I0318 11:01:03.565230 4778 generic.go:334] "Generic (PLEG): container finished" podID="c15df1c1-2c25-4e82-9933-ada0bd8d6d73" containerID="3fd6d1f0823c046b310eb8beb463813250215ced92cb6e72ad8093250484ff70" exitCode=0 Mar 18 11:01:03 crc kubenswrapper[4778]: I0318 11:01:03.565335 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" 
event={"ID":"c15df1c1-2c25-4e82-9933-ada0bd8d6d73","Type":"ContainerDied","Data":"3fd6d1f0823c046b310eb8beb463813250215ced92cb6e72ad8093250484ff70"} Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.575052 4778 generic.go:334] "Generic (PLEG): container finished" podID="d34b9add-0199-4bf1-81f8-fa4c2a9138e7" containerID="f984a2631cb84423ec49d0a74a80858b922aaf45ecd4a6618c404c384cca758e" exitCode=0 Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.575235 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563861-czdsg" event={"ID":"d34b9add-0199-4bf1-81f8-fa4c2a9138e7","Type":"ContainerDied","Data":"f984a2631cb84423ec49d0a74a80858b922aaf45ecd4a6618c404c384cca758e"} Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.706414 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.763132 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-cmwv6"] Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.775496 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-cmwv6"] Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.825490 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-host\") pod \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.825585 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-host" (OuterVolumeSpecName: "host") pod "c15df1c1-2c25-4e82-9933-ada0bd8d6d73" (UID: "c15df1c1-2c25-4e82-9933-ada0bd8d6d73"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.825730 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbgjp\" (UniqueName: \"kubernetes.io/projected/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-kube-api-access-dbgjp\") pod \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.826272 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-host\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.831500 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-kube-api-access-dbgjp" (OuterVolumeSpecName: "kube-api-access-dbgjp") pod "c15df1c1-2c25-4e82-9933-ada0bd8d6d73" (UID: "c15df1c1-2c25-4e82-9933-ada0bd8d6d73"). InnerVolumeSpecName "kube-api-access-dbgjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.928575 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbgjp\" (UniqueName: \"kubernetes.io/projected/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-kube-api-access-dbgjp\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.584949 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b962272cb27c5f171848527e65fc41e0d889940197edbf9741702f400418883" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.585050 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.933662 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.987682 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-gvcbm"] Mar 18 11:01:05 crc kubenswrapper[4778]: E0318 11:01:05.988189 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c15df1c1-2c25-4e82-9933-ada0bd8d6d73" containerName="container-00" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.988232 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c15df1c1-2c25-4e82-9933-ada0bd8d6d73" containerName="container-00" Mar 18 11:01:05 crc kubenswrapper[4778]: E0318 11:01:05.988279 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34b9add-0199-4bf1-81f8-fa4c2a9138e7" containerName="keystone-cron" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.988290 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34b9add-0199-4bf1-81f8-fa4c2a9138e7" containerName="keystone-cron" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.988509 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c15df1c1-2c25-4e82-9933-ada0bd8d6d73" containerName="container-00" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.988539 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34b9add-0199-4bf1-81f8-fa4c2a9138e7" containerName="keystone-cron" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.989405 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.061293 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-config-data\") pod \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.061349 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-fernet-keys\") pod \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.062369 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-combined-ca-bundle\") pod \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.062420 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbgmk\" (UniqueName: \"kubernetes.io/projected/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-kube-api-access-sbgmk\") pod \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.066952 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d34b9add-0199-4bf1-81f8-fa4c2a9138e7" (UID: "d34b9add-0199-4bf1-81f8-fa4c2a9138e7"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.068792 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-kube-api-access-sbgmk" (OuterVolumeSpecName: "kube-api-access-sbgmk") pod "d34b9add-0199-4bf1-81f8-fa4c2a9138e7" (UID: "d34b9add-0199-4bf1-81f8-fa4c2a9138e7"). InnerVolumeSpecName "kube-api-access-sbgmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.098228 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d34b9add-0199-4bf1-81f8-fa4c2a9138e7" (UID: "d34b9add-0199-4bf1-81f8-fa4c2a9138e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.111090 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-config-data" (OuterVolumeSpecName: "config-data") pod "d34b9add-0199-4bf1-81f8-fa4c2a9138e7" (UID: "d34b9add-0199-4bf1-81f8-fa4c2a9138e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.164711 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpg7b\" (UniqueName: \"kubernetes.io/projected/0f322526-d81e-4a2e-a084-151cb4304b64-kube-api-access-fpg7b\") pod \"crc-debug-gvcbm\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.164996 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f322526-d81e-4a2e-a084-151cb4304b64-host\") pod \"crc-debug-gvcbm\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.165104 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.165119 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbgmk\" (UniqueName: \"kubernetes.io/projected/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-kube-api-access-sbgmk\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.165145 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.165152 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.206179 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c15df1c1-2c25-4e82-9933-ada0bd8d6d73" path="/var/lib/kubelet/pods/c15df1c1-2c25-4e82-9933-ada0bd8d6d73/volumes" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.266990 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f322526-d81e-4a2e-a084-151cb4304b64-host\") pod \"crc-debug-gvcbm\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.267044 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpg7b\" (UniqueName: \"kubernetes.io/projected/0f322526-d81e-4a2e-a084-151cb4304b64-kube-api-access-fpg7b\") pod \"crc-debug-gvcbm\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.267186 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f322526-d81e-4a2e-a084-151cb4304b64-host\") pod \"crc-debug-gvcbm\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.298286 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpg7b\" (UniqueName: \"kubernetes.io/projected/0f322526-d81e-4a2e-a084-151cb4304b64-kube-api-access-fpg7b\") pod \"crc-debug-gvcbm\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.315608 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.596387 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" event={"ID":"0f322526-d81e-4a2e-a084-151cb4304b64","Type":"ContainerStarted","Data":"4885ccf444eddc8a6ce122274d5f1476536fe5db0e3a880ac80a8141660c536e"} Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.596724 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" event={"ID":"0f322526-d81e-4a2e-a084-151cb4304b64","Type":"ContainerStarted","Data":"eafbdfb496b252d70cfdde0e33b23047271e7bcbfeb28f1dc5fe3cb1ad3e962d"} Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.599053 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563861-czdsg" event={"ID":"d34b9add-0199-4bf1-81f8-fa4c2a9138e7","Type":"ContainerDied","Data":"2316c1f8492c44ecfcb6c8739c1c26728d3712b4d7998cdb3a69536131c4537f"} Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.599091 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2316c1f8492c44ecfcb6c8739c1c26728d3712b4d7998cdb3a69536131c4537f" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.599126 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.622570 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" podStartSLOduration=1.622550769 podStartE2EDuration="1.622550769s" podCreationTimestamp="2026-03-18 11:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:01:06.614347836 +0000 UTC m=+7133.189092706" watchObservedRunningTime="2026-03-18 11:01:06.622550769 +0000 UTC m=+7133.197295609" Mar 18 11:01:07 crc kubenswrapper[4778]: I0318 11:01:07.609104 4778 generic.go:334] "Generic (PLEG): container finished" podID="0f322526-d81e-4a2e-a084-151cb4304b64" containerID="4885ccf444eddc8a6ce122274d5f1476536fe5db0e3a880ac80a8141660c536e" exitCode=0 Mar 18 11:01:07 crc kubenswrapper[4778]: I0318 11:01:07.609135 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" event={"ID":"0f322526-d81e-4a2e-a084-151cb4304b64","Type":"ContainerDied","Data":"4885ccf444eddc8a6ce122274d5f1476536fe5db0e3a880ac80a8141660c536e"} Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.716167 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.817833 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f322526-d81e-4a2e-a084-151cb4304b64-host\") pod \"0f322526-d81e-4a2e-a084-151cb4304b64\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.817936 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpg7b\" (UniqueName: \"kubernetes.io/projected/0f322526-d81e-4a2e-a084-151cb4304b64-kube-api-access-fpg7b\") pod \"0f322526-d81e-4a2e-a084-151cb4304b64\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.819264 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f322526-d81e-4a2e-a084-151cb4304b64-host" (OuterVolumeSpecName: "host") pod "0f322526-d81e-4a2e-a084-151cb4304b64" (UID: "0f322526-d81e-4a2e-a084-151cb4304b64"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.828841 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f322526-d81e-4a2e-a084-151cb4304b64-kube-api-access-fpg7b" (OuterVolumeSpecName: "kube-api-access-fpg7b") pod "0f322526-d81e-4a2e-a084-151cb4304b64" (UID: "0f322526-d81e-4a2e-a084-151cb4304b64"). InnerVolumeSpecName "kube-api-access-fpg7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.920237 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f322526-d81e-4a2e-a084-151cb4304b64-host\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.920273 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpg7b\" (UniqueName: \"kubernetes.io/projected/0f322526-d81e-4a2e-a084-151cb4304b64-kube-api-access-fpg7b\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:09 crc kubenswrapper[4778]: I0318 11:01:09.250008 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-gvcbm"] Mar 18 11:01:09 crc kubenswrapper[4778]: I0318 11:01:09.265593 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-gvcbm"] Mar 18 11:01:09 crc kubenswrapper[4778]: I0318 11:01:09.625365 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eafbdfb496b252d70cfdde0e33b23047271e7bcbfeb28f1dc5fe3cb1ad3e962d" Mar 18 11:01:09 crc kubenswrapper[4778]: I0318 11:01:09.625497 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.207067 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f322526-d81e-4a2e-a084-151cb4304b64" path="/var/lib/kubelet/pods/0f322526-d81e-4a2e-a084-151cb4304b64/volumes" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.438584 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-2nws9"] Mar 18 11:01:10 crc kubenswrapper[4778]: E0318 11:01:10.439965 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f322526-d81e-4a2e-a084-151cb4304b64" containerName="container-00" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.440120 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f322526-d81e-4a2e-a084-151cb4304b64" containerName="container-00" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.440548 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f322526-d81e-4a2e-a084-151cb4304b64" containerName="container-00" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.441879 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.454168 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcq5b\" (UniqueName: \"kubernetes.io/projected/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-kube-api-access-bcq5b\") pod \"crc-debug-2nws9\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.454278 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-host\") pod \"crc-debug-2nws9\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.556043 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcq5b\" (UniqueName: \"kubernetes.io/projected/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-kube-api-access-bcq5b\") pod \"crc-debug-2nws9\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.556480 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-host\") pod \"crc-debug-2nws9\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.556715 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-host\") pod \"crc-debug-2nws9\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc 
kubenswrapper[4778]: I0318 11:01:10.592776 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcq5b\" (UniqueName: \"kubernetes.io/projected/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-kube-api-access-bcq5b\") pod \"crc-debug-2nws9\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.767580 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: W0318 11:01:10.799467 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7738bdaf_c632_4ce0_b83d_cb4d38c4760a.slice/crio-807c35c264101c5aee2828e057a54c122affe04218766c98c693137411008edc WatchSource:0}: Error finding container 807c35c264101c5aee2828e057a54c122affe04218766c98c693137411008edc: Status 404 returned error can't find the container with id 807c35c264101c5aee2828e057a54c122affe04218766c98c693137411008edc Mar 18 11:01:11 crc kubenswrapper[4778]: I0318 11:01:11.648298 4778 generic.go:334] "Generic (PLEG): container finished" podID="7738bdaf-c632-4ce0-b83d-cb4d38c4760a" containerID="f5477be7293a466e680a1b2c883902065f2398c01a4cd1962f22034d098fc2a3" exitCode=0 Mar 18 11:01:11 crc kubenswrapper[4778]: I0318 11:01:11.648373 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-2nws9" event={"ID":"7738bdaf-c632-4ce0-b83d-cb4d38c4760a","Type":"ContainerDied","Data":"f5477be7293a466e680a1b2c883902065f2398c01a4cd1962f22034d098fc2a3"} Mar 18 11:01:11 crc kubenswrapper[4778]: I0318 11:01:11.648593 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-2nws9" event={"ID":"7738bdaf-c632-4ce0-b83d-cb4d38c4760a","Type":"ContainerStarted","Data":"807c35c264101c5aee2828e057a54c122affe04218766c98c693137411008edc"} Mar 18 
11:01:11 crc kubenswrapper[4778]: I0318 11:01:11.694173 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-2nws9"] Mar 18 11:01:11 crc kubenswrapper[4778]: I0318 11:01:11.702795 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-2nws9"] Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.762175 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.803680 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcq5b\" (UniqueName: \"kubernetes.io/projected/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-kube-api-access-bcq5b\") pod \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.803860 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-host\") pod \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.804081 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-host" (OuterVolumeSpecName: "host") pod "7738bdaf-c632-4ce0-b83d-cb4d38c4760a" (UID: "7738bdaf-c632-4ce0-b83d-cb4d38c4760a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.805313 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-host\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.813641 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-kube-api-access-bcq5b" (OuterVolumeSpecName: "kube-api-access-bcq5b") pod "7738bdaf-c632-4ce0-b83d-cb4d38c4760a" (UID: "7738bdaf-c632-4ce0-b83d-cb4d38c4760a"). InnerVolumeSpecName "kube-api-access-bcq5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.907031 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcq5b\" (UniqueName: \"kubernetes.io/projected/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-kube-api-access-bcq5b\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:13 crc kubenswrapper[4778]: I0318 11:01:13.672727 4778 scope.go:117] "RemoveContainer" containerID="f5477be7293a466e680a1b2c883902065f2398c01a4cd1962f22034d098fc2a3" Mar 18 11:01:13 crc kubenswrapper[4778]: I0318 11:01:13.672791 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:14 crc kubenswrapper[4778]: I0318 11:01:14.217792 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7738bdaf-c632-4ce0-b83d-cb4d38c4760a" path="/var/lib/kubelet/pods/7738bdaf-c632-4ce0-b83d-cb4d38c4760a/volumes" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.803649 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2hjj7"] Mar 18 11:01:29 crc kubenswrapper[4778]: E0318 11:01:29.804796 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7738bdaf-c632-4ce0-b83d-cb4d38c4760a" containerName="container-00" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.804810 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7738bdaf-c632-4ce0-b83d-cb4d38c4760a" containerName="container-00" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.805022 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7738bdaf-c632-4ce0-b83d-cb4d38c4760a" containerName="container-00" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.806511 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.826630 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hjj7"] Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.982946 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wxbd\" (UniqueName: \"kubernetes.io/projected/abcddae8-6cd1-4a48-b133-af298a8fc9bb-kube-api-access-5wxbd\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.983083 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-utilities\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.983159 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-catalog-content\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.084925 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wxbd\" (UniqueName: \"kubernetes.io/projected/abcddae8-6cd1-4a48-b133-af298a8fc9bb-kube-api-access-5wxbd\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.085262 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-utilities\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.085333 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-catalog-content\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.085844 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-utilities\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.085940 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-catalog-content\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.114082 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wxbd\" (UniqueName: \"kubernetes.io/projected/abcddae8-6cd1-4a48-b133-af298a8fc9bb-kube-api-access-5wxbd\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.139778 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.147773 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.147818 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.649572 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hjj7"] Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.858712 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerStarted","Data":"6796f39f1936165c0d34446a4399a251eaff83374f66c18a58cb0c062de2237f"} Mar 18 11:01:31 crc kubenswrapper[4778]: I0318 11:01:31.870037 4778 generic.go:334] "Generic (PLEG): container finished" podID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerID="8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984" exitCode=0 Mar 18 11:01:31 crc kubenswrapper[4778]: I0318 11:01:31.870090 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerDied","Data":"8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984"} Mar 18 11:01:31 crc kubenswrapper[4778]: I0318 11:01:31.874665 4778 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 11:01:32 crc kubenswrapper[4778]: I0318 11:01:32.883215 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerStarted","Data":"2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157"} Mar 18 11:01:34 crc kubenswrapper[4778]: I0318 11:01:34.904766 4778 generic.go:334] "Generic (PLEG): container finished" podID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerID="2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157" exitCode=0 Mar 18 11:01:34 crc kubenswrapper[4778]: I0318 11:01:34.904844 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerDied","Data":"2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157"} Mar 18 11:01:35 crc kubenswrapper[4778]: I0318 11:01:35.918932 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerStarted","Data":"70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0"} Mar 18 11:01:35 crc kubenswrapper[4778]: I0318 11:01:35.949637 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2hjj7" podStartSLOduration=3.294141401 podStartE2EDuration="6.949614355s" podCreationTimestamp="2026-03-18 11:01:29 +0000 UTC" firstStartedPulling="2026-03-18 11:01:31.874365411 +0000 UTC m=+7158.449110271" lastFinishedPulling="2026-03-18 11:01:35.529838385 +0000 UTC m=+7162.104583225" observedRunningTime="2026-03-18 11:01:35.938526874 +0000 UTC m=+7162.513271744" watchObservedRunningTime="2026-03-18 11:01:35.949614355 +0000 UTC m=+7162.524359205" Mar 18 11:01:40 crc kubenswrapper[4778]: I0318 11:01:40.140583 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:40 crc kubenswrapper[4778]: I0318 11:01:40.141355 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:41 crc kubenswrapper[4778]: I0318 11:01:41.185320 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-2hjj7" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="registry-server" probeResult="failure" output=< Mar 18 11:01:41 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 11:01:41 crc kubenswrapper[4778]: > Mar 18 11:01:46 crc kubenswrapper[4778]: E0318 11:01:46.190323 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:01:50 crc kubenswrapper[4778]: I0318 11:01:50.206529 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:50 crc kubenswrapper[4778]: I0318 11:01:50.269470 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:50 crc kubenswrapper[4778]: I0318 11:01:50.875340 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hjj7"] Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.109613 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2hjj7" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="registry-server" containerID="cri-o://70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0" gracePeriod=2 Mar 18 11:01:52 crc 
kubenswrapper[4778]: I0318 11:01:52.621744 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.684940 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wxbd\" (UniqueName: \"kubernetes.io/projected/abcddae8-6cd1-4a48-b133-af298a8fc9bb-kube-api-access-5wxbd\") pod \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.685047 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-catalog-content\") pod \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.685168 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-utilities\") pod \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.685802 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-utilities" (OuterVolumeSpecName: "utilities") pod "abcddae8-6cd1-4a48-b133-af298a8fc9bb" (UID: "abcddae8-6cd1-4a48-b133-af298a8fc9bb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.697388 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abcddae8-6cd1-4a48-b133-af298a8fc9bb-kube-api-access-5wxbd" (OuterVolumeSpecName: "kube-api-access-5wxbd") pod "abcddae8-6cd1-4a48-b133-af298a8fc9bb" (UID: "abcddae8-6cd1-4a48-b133-af298a8fc9bb"). InnerVolumeSpecName "kube-api-access-5wxbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.749130 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abcddae8-6cd1-4a48-b133-af298a8fc9bb" (UID: "abcddae8-6cd1-4a48-b133-af298a8fc9bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.787122 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.787169 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wxbd\" (UniqueName: \"kubernetes.io/projected/abcddae8-6cd1-4a48-b133-af298a8fc9bb-kube-api-access-5wxbd\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.787184 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.125847 4778 generic.go:334] "Generic (PLEG): container finished" podID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" 
containerID="70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0" exitCode=0 Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.125885 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerDied","Data":"70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0"} Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.125913 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerDied","Data":"6796f39f1936165c0d34446a4399a251eaff83374f66c18a58cb0c062de2237f"} Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.125932 4778 scope.go:117] "RemoveContainer" containerID="70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.125955 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.160363 4778 scope.go:117] "RemoveContainer" containerID="2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.195144 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hjj7"] Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.203715 4778 scope.go:117] "RemoveContainer" containerID="8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.207967 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2hjj7"] Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.234071 4778 scope.go:117] "RemoveContainer" containerID="70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0" Mar 18 11:01:53 crc kubenswrapper[4778]: E0318 11:01:53.234839 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0\": container with ID starting with 70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0 not found: ID does not exist" containerID="70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.234866 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0"} err="failed to get container status \"70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0\": rpc error: code = NotFound desc = could not find container \"70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0\": container with ID starting with 70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0 not 
found: ID does not exist" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.234891 4778 scope.go:117] "RemoveContainer" containerID="2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157" Mar 18 11:01:53 crc kubenswrapper[4778]: E0318 11:01:53.235231 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157\": container with ID starting with 2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157 not found: ID does not exist" containerID="2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.235275 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157"} err="failed to get container status \"2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157\": rpc error: code = NotFound desc = could not find container \"2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157\": container with ID starting with 2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157 not found: ID does not exist" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.235300 4778 scope.go:117] "RemoveContainer" containerID="8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984" Mar 18 11:01:53 crc kubenswrapper[4778]: E0318 11:01:53.235559 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984\": container with ID starting with 8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984 not found: ID does not exist" containerID="8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.235585 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984"} err="failed to get container status \"8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984\": rpc error: code = NotFound desc = could not find container \"8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984\": container with ID starting with 8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984 not found: ID does not exist" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.974712 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ansibletest-ansibletest_1fb58f5e-1c8b-45e2-bf86-b81af58b66a9/ansibletest-ansibletest/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.115115 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84d7458cd-cb86l_e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55/barbican-api/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.142033 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84d7458cd-cb86l_e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55/barbican-api-log/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.197644 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" path="/var/lib/kubelet/pods/abcddae8-6cd1-4a48-b133-af298a8fc9bb/volumes" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.273727 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8bc77f476-tw7vd_3a006670-1a48-4421-8471-dd961c0e1d4c/barbican-keystone-listener/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.468884 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-769d964c9f-nxhk2_eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b/barbican-worker/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.496646 
4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-769d964c9f-nxhk2_eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b/barbican-worker-log/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.696672 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk_f4bddd5e-314b-49c0-963c-107e6798c40e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.861472 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8bc77f476-tw7vd_3a006670-1a48-4421-8471-dd961c0e1d4c/barbican-keystone-listener-log/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.905299 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/ceilometer-central-agent/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.955386 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/ceilometer-notification-agent/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.978741 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/proxy-httpd/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.037922 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/sg-core/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.131850 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv_fed5a515-ed14-40f1-9282-4e87fe319bf6/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.251912 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl_34acd7f6-6263-4871-892c-02835ebbab27/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.451593 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51/cinder-api/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.508251 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51/cinder-api-log/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.696990 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a419ad60-27c7-4a74-a7a0-f6b04b3bcb13/cinder-backup/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.754270 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a419ad60-27c7-4a74-a7a0-f6b04b3bcb13/probe/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.764502 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_bbde13ad-dacc-4f17-8da3-109ede6972c0/cinder-scheduler/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.913514 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_bbde13ad-dacc-4f17-8da3-109ede6972c0/probe/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.997669 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_81d18509-d2fc-47e2-b814-94c4807a4dd6/probe/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.003009 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_81d18509-d2fc-47e2-b814-94c4807a4dd6/cinder-volume/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.245262 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5_d44d6afe-0030-4d9d-9fa7-f75274eff578/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.247425 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-r64nk_4f5bf2d2-78b2-4358-a582-482ab3020da3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.409138 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-pr7j8_78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3/init/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.643793 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-pr7j8_78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3/init/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.725188 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd/glance-httpd/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.828441 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-pr7j8_78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3/dnsmasq-dns/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.877180 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd/glance-log/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.053340 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a18a46b5-39a7-4da9-8994-5c4716bc0fc3/glance-log/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.059401 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_a18a46b5-39a7-4da9-8994-5c4716bc0fc3/glance-httpd/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.199359 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-644f48df4-b7jhq_e0a0a638-c445-4931-861e-d35704487c97/horizon/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.391714 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizontest-tests-horizontest_49ff1200-d42e-4022-990d-619169f357f4/horizontest-tests-horizontest/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.603539 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gfphd_7c70009e-cfb3-4598-9ae4-f1d90a2a63d5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.748173 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-9zw82_5e5ffed6-fceb-4d38-aa29-e9836a8d9f50/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.986683 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29563801-nctwn_8ace9f11-f4d8-4801-afa2-5b723d52d41e/keystone-cron/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.176231 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29563861-czdsg_d34b9add-0199-4bf1-81f8-fa4c2a9138e7/keystone-cron/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.325744 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1663e1b0-f9b0-4168-9386-abf2c1b56b43/kube-state-metrics/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.378874 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-644f48df4-b7jhq_e0a0a638-c445-4931-861e-d35704487c97/horizon-log/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.584952 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr_d50b5540-c2ca-4889-bbb0-3b5d04bc602f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.604603 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_35adb68e-2fb0-437c-bea7-e46f05e4918c/manila-api-log/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.800060 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_35adb68e-2fb0-437c-bea7-e46f05e4918c/manila-api/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.830424 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e9af702d-3a1a-490e-82f5-e99c1718ef83/probe/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.981072 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e9af702d-3a1a-490e-82f5-e99c1718ef83/manila-scheduler/0.log" Mar 18 11:01:59 crc kubenswrapper[4778]: I0318 11:01:59.102688 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_821dda0e-cde2-45a4-b23a-3d13565be515/probe/0.log" Mar 18 11:01:59 crc kubenswrapper[4778]: I0318 11:01:59.166517 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_821dda0e-cde2-45a4-b23a-3d13565be515/manila-share/0.log" Mar 18 11:01:59 crc kubenswrapper[4778]: I0318 11:01:59.909143 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v_52250b90-fbc6-418e-9a5f-4873d5fa5cd0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 
11:02:00.141368 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563862-flgc8"] Mar 18 11:02:00 crc kubenswrapper[4778]: E0318 11:02:00.141821 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="registry-server" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.141842 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="registry-server" Mar 18 11:02:00 crc kubenswrapper[4778]: E0318 11:02:00.141856 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="extract-utilities" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.141862 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="extract-utilities" Mar 18 11:02:00 crc kubenswrapper[4778]: E0318 11:02:00.141872 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="extract-content" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.141878 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="extract-content" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.142159 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="registry-server" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.143447 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.146380 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.146665 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.146790 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.147521 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.147552 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.147583 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.148239 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.148306 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" gracePeriod=600 Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.161938 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563862-flgc8"] Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.251362 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x749h\" (UniqueName: \"kubernetes.io/projected/0e9b6093-6849-4ee1-829c-3893c8efc355-kube-api-access-x749h\") pod \"auto-csr-approver-29563862-flgc8\" (UID: \"0e9b6093-6849-4ee1-829c-3893c8efc355\") " pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:00 crc kubenswrapper[4778]: E0318 11:02:00.301061 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.352801 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x749h\" (UniqueName: \"kubernetes.io/projected/0e9b6093-6849-4ee1-829c-3893c8efc355-kube-api-access-x749h\") pod \"auto-csr-approver-29563862-flgc8\" (UID: \"0e9b6093-6849-4ee1-829c-3893c8efc355\") " pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.384273 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x749h\" (UniqueName: \"kubernetes.io/projected/0e9b6093-6849-4ee1-829c-3893c8efc355-kube-api-access-x749h\") pod \"auto-csr-approver-29563862-flgc8\" (UID: \"0e9b6093-6849-4ee1-829c-3893c8efc355\") " pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.434839 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d979499f7-4flxt_da263057-3652-4ae8-8435-4f80e4b13804/neutron-httpd/0.log" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.462919 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.979350 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563862-flgc8"] Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.158134 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-75996d8fd4-jhtd2_4c045639-00d0-4ba6-9d75-c67934521e29/keystone-api/0.log" Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.214182 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563862-flgc8" event={"ID":"0e9b6093-6849-4ee1-829c-3893c8efc355","Type":"ContainerStarted","Data":"147cc97a1f98cfc31f25273d021eb620b9bf14cdb68020dd46c294882b45318b"} Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.217426 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" exitCode=0 Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.217480 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"} Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.217804 4778 scope.go:117] "RemoveContainer" containerID="3a30fa73502544b7ae8a345c4023efae179b70a66379eb3e45aa01314f418265" Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.218623 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:02:01 crc kubenswrapper[4778]: E0318 11:02:01.218994 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.451060 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d979499f7-4flxt_da263057-3652-4ae8-8435-4f80e4b13804/neutron-api/0.log" Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.991533 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9ba2a389-4009-4dab-bc75-45a574e50bbc/nova-cell1-conductor-conductor/0.log" Mar 18 11:02:02 crc kubenswrapper[4778]: I0318 11:02:02.184982 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3fc908a0-dc90-4df9-869c-5c0820cac423/nova-cell0-conductor-conductor/0.log" Mar 18 11:02:02 crc kubenswrapper[4778]: I0318 11:02:02.587241 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9549b39b-0fc5-4e89-b64a-de83c80735ed/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 11:02:02 crc kubenswrapper[4778]: I0318 11:02:02.765238 4778 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9_b2db5491-57b4-427a-b306-5e525a1e7c27/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:02:03 crc kubenswrapper[4778]: I0318 11:02:03.071274 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_28f01ca6-f7d2-4de3-9aa9-256803533b80/nova-metadata-log/0.log" Mar 18 11:02:03 crc kubenswrapper[4778]: I0318 11:02:03.254325 4778 generic.go:334] "Generic (PLEG): container finished" podID="0e9b6093-6849-4ee1-829c-3893c8efc355" containerID="3545c5320c999ca132cdbecfec3fe0adacffa5fbc99319fc9409f6ba39ed60ac" exitCode=0 Mar 18 11:02:03 crc kubenswrapper[4778]: I0318 11:02:03.254372 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563862-flgc8" event={"ID":"0e9b6093-6849-4ee1-829c-3893c8efc355","Type":"ContainerDied","Data":"3545c5320c999ca132cdbecfec3fe0adacffa5fbc99319fc9409f6ba39ed60ac"} Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.224190 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9b1623d1-2084-419e-b36a-80930113a280/nova-scheduler-scheduler/0.log" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.558172 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_28f01ca6-f7d2-4de3-9aa9-256803533b80/nova-metadata-metadata/0.log" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.694533 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.714145 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_49ce9560-3ee2-48d2-b016-a9feefb3a798/mysql-bootstrap/0.log" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.722965 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8a702c51-b7a6-4094-9d34-519102e1cf91/nova-api-log/0.log" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.828388 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x749h\" (UniqueName: \"kubernetes.io/projected/0e9b6093-6849-4ee1-829c-3893c8efc355-kube-api-access-x749h\") pod \"0e9b6093-6849-4ee1-829c-3893c8efc355\" (UID: \"0e9b6093-6849-4ee1-829c-3893c8efc355\") " Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.838429 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9b6093-6849-4ee1-829c-3893c8efc355-kube-api-access-x749h" (OuterVolumeSpecName: "kube-api-access-x749h") pod "0e9b6093-6849-4ee1-829c-3893c8efc355" (UID: "0e9b6093-6849-4ee1-829c-3893c8efc355"). InnerVolumeSpecName "kube-api-access-x749h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.932441 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x749h\" (UniqueName: \"kubernetes.io/projected/0e9b6093-6849-4ee1-829c-3893c8efc355-kube-api-access-x749h\") on node \"crc\" DevicePath \"\"" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.983402 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_49ce9560-3ee2-48d2-b016-a9feefb3a798/mysql-bootstrap/0.log" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.984038 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_49ce9560-3ee2-48d2-b016-a9feefb3a798/galera/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.171524 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cfadc08e-9e77-4b6f-be89-fc7c726e85b7/mysql-bootstrap/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.282922 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563862-flgc8" event={"ID":"0e9b6093-6849-4ee1-829c-3893c8efc355","Type":"ContainerDied","Data":"147cc97a1f98cfc31f25273d021eb620b9bf14cdb68020dd46c294882b45318b"} Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.282960 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="147cc97a1f98cfc31f25273d021eb620b9bf14cdb68020dd46c294882b45318b" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.282963 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.400240 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cfadc08e-9e77-4b6f-be89-fc7c726e85b7/mysql-bootstrap/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.451552 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cfadc08e-9e77-4b6f-be89-fc7c726e85b7/galera/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.556997 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fec302c3-e5fc-4019-b4f5-50de6bdde59f/openstackclient/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.696494 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-djmq6_f58533cf-4c57-4c3a-b772-e2a488298d7e/ovn-controller/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.764052 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563856-kvmq4"] Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.770801 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8a702c51-b7a6-4094-9d34-519102e1cf91/nova-api-api/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.773905 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563856-kvmq4"] Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.801918 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-2ldk7_2c6e8f7b-9b48-4814-9e73-fc9833c26cc9/openstack-network-exporter/0.log" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.026905 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovsdb-server-init/0.log" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 
11:02:06.196240 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b69a324-153a-4262-92ea-62c8b9d5928e" path="/var/lib/kubelet/pods/0b69a324-153a-4262-92ea-62c8b9d5928e/volumes" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.205406 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovsdb-server/0.log" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.206356 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovsdb-server-init/0.log" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.213607 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovs-vswitchd/0.log" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.403474 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7jqhd_1f0f4177-ad12-4848-bbd7-39b004344cb3/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.409358 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac3419bd-88ba-4b83-bd93-ad5638bc7fd0/openstack-network-exporter/0.log" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.472256 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac3419bd-88ba-4b83-bd93-ad5638bc7fd0/ovn-northd/0.log" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.612417 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_113a3fc7-40a1-46f9-b93f-01a34fcaf4aa/openstack-network-exporter/0.log" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.663913 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_113a3fc7-40a1-46f9-b93f-01a34fcaf4aa/ovsdbserver-nb/0.log" Mar 
18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.783209 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_495e34ad-2f4d-46de-95e9-37b34a35f2d2/openstack-network-exporter/0.log" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.853267 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_495e34ad-2f4d-46de-95e9-37b34a35f2d2/ovsdbserver-sb/0.log" Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.106124 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f9428cb3-4fdf-4b01-9368-28b413ecf82f/setup-container/0.log" Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.326431 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f9428cb3-4fdf-4b01-9368-28b413ecf82f/setup-container/0.log" Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.336233 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f9428cb3-4fdf-4b01-9368-28b413ecf82f/rabbitmq/0.log" Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.424967 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7588d8786-t6x7l_fe0de426-6927-42ea-8b29-8bc01c27fe69/placement-api/0.log" Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.577784 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0191a745-1fe2-4a1c-b007-96525ad39787/setup-container/0.log" Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.647240 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7588d8786-t6x7l_fe0de426-6927-42ea-8b29-8bc01c27fe69/placement-log/0.log" Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.755441 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0191a745-1fe2-4a1c-b007-96525ad39787/setup-container/0.log" Mar 18 11:02:07 crc 
kubenswrapper[4778]: I0318 11:02:07.819024 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0191a745-1fe2-4a1c-b007-96525ad39787/rabbitmq/0.log" Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.873648 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7_613d0a31-a371-4c66-8254-85a7cc864fd0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.069535 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx_136dbfab-32f1-40ee-b685-74411fbc06ba/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.085341 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-9bss8_80a8d263-9bba-4db0-928e-f633b4ad5314/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.257711 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-j74ts_53b18647-af19-457c-9543-2156c1ace738/ssh-known-hosts-edpm-deployment/0.log" Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.403511 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s00-full_757e3758-d646-4267-8c4c-b5efb0dcf709/tempest-tests-tempest-tests-runner/0.log" Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.527792 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s01-single-test_c5a7a532-f8c2-4741-9892-65047a4cb225/tempest-tests-tempest-tests-runner/0.log" Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.578154 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-ansibletest-ansibletest-ansibletest_1f57757d-6483-4e1a-9a09-e63026f73e70/test-operator-logs-container/0.log" Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.718963 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-horizontest-horizontest-tests-horizontest_3db5e33d-384f-4df3-bfb8-ba279b83f7e4/test-operator-logs-container/0.log" Mar 18 11:02:09 crc kubenswrapper[4778]: I0318 11:02:09.014048 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_fb176b71-d782-4b0d-963f-94acef50cf11/test-operator-logs-container/0.log" Mar 18 11:02:09 crc kubenswrapper[4778]: I0318 11:02:09.128409 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tobiko-tobiko-tests-tobiko_4e028d5e-666c-497c-949e-97860410ad74/test-operator-logs-container/0.log" Mar 18 11:02:09 crc kubenswrapper[4778]: I0318 11:02:09.171969 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s00-podified-functional_5c0d8cb1-d7bc-4694-ac54-e0a9f8312557/tobiko-tests-tobiko/0.log" Mar 18 11:02:09 crc kubenswrapper[4778]: I0318 11:02:09.335883 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s01-sanity_bd565818-8912-47ba-881f-f88011fa9b46/tobiko-tests-tobiko/0.log" Mar 18 11:02:09 crc kubenswrapper[4778]: I0318 11:02:09.438426 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-44vc9_5e5ecb95-ba90-4f70-ae42-63e71026ffef/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:02:10 crc kubenswrapper[4778]: I0318 11:02:10.755160 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_fc50d224-cd65-4a46-b3d0-b40acdbda53d/memcached/0.log" Mar 18 11:02:14 crc kubenswrapper[4778]: I0318 11:02:14.197054 
4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:02:14 crc kubenswrapper[4778]: E0318 11:02:14.197808 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:02:28 crc kubenswrapper[4778]: I0318 11:02:28.186795 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:02:28 crc kubenswrapper[4778]: E0318 11:02:28.187693 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:02:33 crc kubenswrapper[4778]: I0318 11:02:33.418133 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-fsxlt_3390909b-6271-40dd-9662-0710f6866143/manager/0.log" Mar 18 11:02:33 crc kubenswrapper[4778]: I0318 11:02:33.656061 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-7mbx2_710ababb-0bee-441d-8dd0-e6a72ea2b2e3/manager/0.log" Mar 18 11:02:33 crc kubenswrapper[4778]: I0318 11:02:33.854176 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/util/0.log" Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.100642 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/pull/0.log" Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.107341 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/util/0.log" Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.176258 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/pull/0.log" Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.319539 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/pull/0.log" Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.325155 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/util/0.log" Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.397698 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/extract/0.log" Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.637806 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-wb4pc_b41dbd4a-33dd-4dca-9356-34c740e8063f/manager/0.log" Mar 18 11:02:35 crc 
kubenswrapper[4778]: I0318 11:02:35.025407 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-t5c4w_aceb2f7b-585f-451a-83b8-e673965ada87/manager/0.log" Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.077330 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-wxftc_0526f654-9ddc-4495-bb04-be13e53b6a1b/manager/0.log" Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.126925 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-x7rnp_124dc549-cb2a-4b1c-a610-093cf9b8c05d/manager/0.log" Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.297155 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-fjjvl_3c86f76c-1617-45e9-9573-f6fd51803b45/manager/0.log" Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.567232 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-5xvtc_e1ec7bae-8e15-4844-84d2-ff5951d0be31/manager/0.log" Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.578963 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-64c4x_66d3bf3a-086c-4340-ba73-209f526fc33c/manager/0.log" Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.648097 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-zpc92_211c991a-9406-4360-aa7f-830be3aa55db/manager/0.log" Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.782789 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-47sbc_37675366-70a8-4e0b-b92b-f7055547d918/manager/0.log" Mar 18 
11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.875011 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-k4r2p_ae690990-eeb1-4871-8c51-dd3b547e1193/manager/0.log" Mar 18 11:02:36 crc kubenswrapper[4778]: I0318 11:02:36.071621 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-pzjdt_c776af1e-ad54-40fe-9bed-a0a09ce0eea7/manager/0.log" Mar 18 11:02:36 crc kubenswrapper[4778]: I0318 11:02:36.084739 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-h6whs_e245908e-e35e-403c-93f6-48371904ae42/manager/0.log" Mar 18 11:02:36 crc kubenswrapper[4778]: I0318 11:02:36.248153 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-xdgmv_80822932-2943-4f81-9436-1553ed031359/manager/0.log" Mar 18 11:02:36 crc kubenswrapper[4778]: I0318 11:02:36.378314 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-654f4fc7f7-9d4pb_b8267dff-2541-481e-bc64-13eb8d19300b/operator/0.log" Mar 18 11:02:36 crc kubenswrapper[4778]: I0318 11:02:36.582919 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-v7qxm_c508c810-232f-48c1-8d15-bbbb118d2948/registry-server/0.log" Mar 18 11:02:36 crc kubenswrapper[4778]: I0318 11:02:36.848716 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-fgfk9_208b26f2-3c91-4966-9d01-8fe73e4a7d87/manager/0.log" Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.007840 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-d5w9q_2f8e8860-00a1-43fc-9776-c617f270cc50/manager/0.log" 
Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.107679 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5jrv8_b837636e-8c09-42b7-9a81-e7875df68344/operator/0.log" Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.326299 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-c6l5k_8ccabb3b-da59-4ab0-89c8-99094a939f0d/manager/0.log" Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.542379 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-tx9zq_9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77/manager/0.log" Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.715733 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-54c5f5bc8-jsm76_99adb6be-2a3e-4148-8074-9258222ebd60/manager/0.log" Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.790499 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-f5c7df4d7-m4kvr_3c7e3158-5139-467d-b33c-808747f0d9be/manager/0.log" Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.862336 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-sgs49_57277339-c9be-4de1-8e35-72ae98d33905/manager/0.log" Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.911687 4778 scope.go:117] "RemoveContainer" containerID="82c47033c6d17fb0d1f1f077c5ae48584be4ec251f8c624e7bed8591ae05dffd" Mar 18 11:02:40 crc kubenswrapper[4778]: I0318 11:02:40.188061 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:02:40 crc kubenswrapper[4778]: E0318 11:02:40.188999 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:02:51 crc kubenswrapper[4778]: I0318 11:02:51.186855 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:02:51 crc kubenswrapper[4778]: E0318 11:02:51.187741 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:02:56 crc kubenswrapper[4778]: I0318 11:02:56.210986 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qtggn_ba84f396-0169-4d5e-a126-60ac9d6d49f8/control-plane-machine-set-operator/0.log" Mar 18 11:02:56 crc kubenswrapper[4778]: I0318 11:02:56.408401 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-smtz9_f06790e0-cf8c-48f0-8d48-893663fdbd1c/kube-rbac-proxy/0.log" Mar 18 11:02:56 crc kubenswrapper[4778]: I0318 11:02:56.419960 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-smtz9_f06790e0-cf8c-48f0-8d48-893663fdbd1c/machine-api-operator/0.log" Mar 18 11:03:03 crc kubenswrapper[4778]: I0318 11:03:03.188034 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 
11:03:03 crc kubenswrapper[4778]: E0318 11:03:03.191586 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:03:03 crc kubenswrapper[4778]: I0318 11:03:03.452604 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sdl4q"] Mar 18 11:03:03 crc kubenswrapper[4778]: E0318 11:03:03.453217 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9b6093-6849-4ee1-829c-3893c8efc355" containerName="oc" Mar 18 11:03:03 crc kubenswrapper[4778]: I0318 11:03:03.453244 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9b6093-6849-4ee1-829c-3893c8efc355" containerName="oc" Mar 18 11:03:03 crc kubenswrapper[4778]: I0318 11:03:03.453599 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9b6093-6849-4ee1-829c-3893c8efc355" containerName="oc" Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.455855 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.466924 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sdl4q"] Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.608373 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-catalog-content\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.608646 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-utilities\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.609598 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tsld\" (UniqueName: \"kubernetes.io/projected/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-kube-api-access-8tsld\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.711333 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-utilities\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.711489 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8tsld\" (UniqueName: \"kubernetes.io/projected/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-kube-api-access-8tsld\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.711545 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-catalog-content\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.711882 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-utilities\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.711965 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-catalog-content\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.733997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tsld\" (UniqueName: \"kubernetes.io/projected/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-kube-api-access-8tsld\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.788020 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:04.344494 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sdl4q"] Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:04.840709 4778 generic.go:334] "Generic (PLEG): container finished" podID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerID="8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917" exitCode=0 Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:04.840821 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerDied","Data":"8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917"} Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:04.841230 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerStarted","Data":"b36d40a379b90ece19ca99233599298bef161851bf4285fc5a21ee921ddbd7a9"} Mar 18 11:03:05 crc kubenswrapper[4778]: I0318 11:03:05.850316 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerStarted","Data":"97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205"} Mar 18 11:03:06 crc kubenswrapper[4778]: I0318 11:03:06.860330 4778 generic.go:334] "Generic (PLEG): container finished" podID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerID="97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205" exitCode=0 Mar 18 11:03:06 crc kubenswrapper[4778]: I0318 11:03:06.860436 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" 
event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerDied","Data":"97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205"} Mar 18 11:03:09 crc kubenswrapper[4778]: I0318 11:03:09.910426 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-qrqw4_e39be52c-c244-44cc-a707-0ec9994991fa/cert-manager-controller/0.log" Mar 18 11:03:10 crc kubenswrapper[4778]: I0318 11:03:10.031115 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-khqrg_24a88e8d-e986-4b3d-a77e-1a3e5162ac9c/cert-manager-cainjector/0.log" Mar 18 11:03:10 crc kubenswrapper[4778]: I0318 11:03:10.116487 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-hjskg_f09bc4b7-d305-4674-8540-283bd0b4901c/cert-manager-webhook/0.log" Mar 18 11:03:12 crc kubenswrapper[4778]: I0318 11:03:12.919747 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerStarted","Data":"0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c"} Mar 18 11:03:12 crc kubenswrapper[4778]: I0318 11:03:12.942224 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sdl4q" podStartSLOduration=2.366019728 podStartE2EDuration="9.942188711s" podCreationTimestamp="2026-03-18 11:03:03 +0000 UTC" firstStartedPulling="2026-03-18 11:03:04.842712569 +0000 UTC m=+7251.417457399" lastFinishedPulling="2026-03-18 11:03:12.418881532 +0000 UTC m=+7258.993626382" observedRunningTime="2026-03-18 11:03:12.93440427 +0000 UTC m=+7259.509149180" watchObservedRunningTime="2026-03-18 11:03:12.942188711 +0000 UTC m=+7259.516933551" Mar 18 11:03:13 crc kubenswrapper[4778]: I0318 11:03:13.788933 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:13 crc kubenswrapper[4778]: I0318 11:03:13.789382 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:14 crc kubenswrapper[4778]: I0318 11:03:14.837305 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-sdl4q" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="registry-server" probeResult="failure" output=< Mar 18 11:03:14 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 11:03:14 crc kubenswrapper[4778]: > Mar 18 11:03:15 crc kubenswrapper[4778]: E0318 11:03:15.188016 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:03:16 crc kubenswrapper[4778]: I0318 11:03:16.187798 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:03:16 crc kubenswrapper[4778]: E0318 11:03:16.188124 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.137736 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-22c9p_8b636ef7-4b85-4506-bb2a-f89bee9b028d/nmstate-console-plugin/0.log" Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.293535 4778 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5thsf_5b97fa25-4d3d-4664-a5fc-41c98bbd272f/nmstate-handler/0.log" Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.316211 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wq8gr_71b50b27-6084-4693-acbc-d14f36759618/kube-rbac-proxy/0.log" Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.325812 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wq8gr_71b50b27-6084-4693-acbc-d14f36759618/nmstate-metrics/0.log" Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.480175 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-sr9ls_1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe/nmstate-operator/0.log" Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.524647 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-thw7f_5961b98d-a41a-4ceb-bb71-4bf3a0fc854d/nmstate-webhook/0.log" Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.837338 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.887356 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sdl4q" Mar 18 11:03:24 crc kubenswrapper[4778]: I0318 11:03:24.074830 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sdl4q"] Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.027633 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sdl4q" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="registry-server" containerID="cri-o://0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c" 
gracePeriod=2
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.544783 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.666959 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tsld\" (UniqueName: \"kubernetes.io/projected/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-kube-api-access-8tsld\") pod \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") "
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.667054 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-catalog-content\") pod \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") "
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.667317 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-utilities\") pod \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") "
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.667925 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-utilities" (OuterVolumeSpecName: "utilities") pod "6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" (UID: "6a9d0b8e-5cb6-43f2-87ce-09acac1ff898"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.678373 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-kube-api-access-8tsld" (OuterVolumeSpecName: "kube-api-access-8tsld") pod "6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" (UID: "6a9d0b8e-5cb6-43f2-87ce-09acac1ff898"). InnerVolumeSpecName "kube-api-access-8tsld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.726662 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" (UID: "6a9d0b8e-5cb6-43f2-87ce-09acac1ff898"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.769631 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.769665 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tsld\" (UniqueName: \"kubernetes.io/projected/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-kube-api-access-8tsld\") on node \"crc\" DevicePath \"\""
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.769692 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.048616 4778 generic.go:334] "Generic (PLEG): container finished" podID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerID="0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c" exitCode=0
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.048666 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerDied","Data":"0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c"}
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.048698 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerDied","Data":"b36d40a379b90ece19ca99233599298bef161851bf4285fc5a21ee921ddbd7a9"}
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.048722 4778 scope.go:117] "RemoveContainer" containerID="0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.048731 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.070999 4778 scope.go:117] "RemoveContainer" containerID="97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.102242 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sdl4q"]
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.124024 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sdl4q"]
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.132495 4778 scope.go:117] "RemoveContainer" containerID="8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.175645 4778 scope.go:117] "RemoveContainer" containerID="0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c"
Mar 18 11:03:26 crc kubenswrapper[4778]: E0318 11:03:26.176155 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c\": container with ID starting with 0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c not found: ID does not exist" containerID="0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.176255 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c"} err="failed to get container status \"0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c\": rpc error: code = NotFound desc = could not find container \"0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c\": container with ID starting with 0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c not found: ID does not exist"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.176291 4778 scope.go:117] "RemoveContainer" containerID="97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205"
Mar 18 11:03:26 crc kubenswrapper[4778]: E0318 11:03:26.176798 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205\": container with ID starting with 97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205 not found: ID does not exist" containerID="97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.176834 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205"} err="failed to get container status \"97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205\": rpc error: code = NotFound desc = could not find container \"97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205\": container with ID starting with 97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205 not found: ID does not exist"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.176860 4778 scope.go:117] "RemoveContainer" containerID="8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917"
Mar 18 11:03:26 crc kubenswrapper[4778]: E0318 11:03:26.177273 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917\": container with ID starting with 8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917 not found: ID does not exist" containerID="8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.177294 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917"} err="failed to get container status \"8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917\": rpc error: code = NotFound desc = could not find container \"8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917\": container with ID starting with 8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917 not found: ID does not exist"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.198529 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" path="/var/lib/kubelet/pods/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898/volumes"
Mar 18 11:03:28 crc kubenswrapper[4778]: I0318 11:03:28.186630 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:03:28 crc kubenswrapper[4778]: E0318 11:03:28.187170 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:03:41 crc kubenswrapper[4778]: I0318 11:03:41.186622 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:03:41 crc kubenswrapper[4778]: E0318 11:03:41.187381 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:03:52 crc kubenswrapper[4778]: I0318 11:03:52.602235 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-sv9kd_1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c/kube-rbac-proxy/0.log"
Mar 18 11:03:52 crc kubenswrapper[4778]: I0318 11:03:52.778709 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-sv9kd_1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c/controller/0.log"
Mar 18 11:03:52 crc kubenswrapper[4778]: I0318 11:03:52.841166 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.004325 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.006689 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.058763 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.125734 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.187627 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:03:53 crc kubenswrapper[4778]: E0318 11:03:53.188060 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.272341 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.281569 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.291753 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.317364 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.485359 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.514681 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.537076 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/controller/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.554139 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.714542 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/frr-metrics/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.754875 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/kube-rbac-proxy-frr/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.759007 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/kube-rbac-proxy/0.log"
Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.989184 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/reloader/0.log"
Mar 18 11:03:54 crc kubenswrapper[4778]: I0318 11:03:54.013648 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jrtjv_0f18e9f0-b3eb-440a-b035-ed8256df5ed9/frr-k8s-webhook-server/0.log"
Mar 18 11:03:54 crc kubenswrapper[4778]: I0318 11:03:54.402301 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-78856dcdc4-9cltx_721ee07f-fded-43ab-9bb7-2e4e56c98515/manager/0.log"
Mar 18 11:03:54 crc kubenswrapper[4778]: I0318 11:03:54.586917 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b499db45c-c5tcr_75885bb8-adce-4801-8941-75042ab330ea/webhook-server/0.log"
Mar 18 11:03:54 crc kubenswrapper[4778]: I0318 11:03:54.661181 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wd69x_1c97662e-d673-42c1-a6ad-75865ba2b8b6/kube-rbac-proxy/0.log"
Mar 18 11:03:55 crc kubenswrapper[4778]: I0318 11:03:55.256645 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wd69x_1c97662e-d673-42c1-a6ad-75865ba2b8b6/speaker/0.log"
Mar 18 11:03:55 crc kubenswrapper[4778]: I0318 11:03:55.955989 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/frr/0.log"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.169613 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563864-np9td"]
Mar 18 11:04:00 crc kubenswrapper[4778]: E0318 11:04:00.170770 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="extract-content"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.170792 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="extract-content"
Mar 18 11:04:00 crc kubenswrapper[4778]: E0318 11:04:00.170842 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="registry-server"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.170854 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="registry-server"
Mar 18 11:04:00 crc kubenswrapper[4778]: E0318 11:04:00.170884 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="extract-utilities"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.170896 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="extract-utilities"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.171247 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="registry-server"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.172016 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563864-np9td"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.177710 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.178544 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.178549 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.184879 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563864-np9td"]
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.266697 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfp29\" (UniqueName: \"kubernetes.io/projected/93ef8df9-98a4-4897-918d-b573fc50f7bb-kube-api-access-tfp29\") pod \"auto-csr-approver-29563864-np9td\" (UID: \"93ef8df9-98a4-4897-918d-b573fc50f7bb\") " pod="openshift-infra/auto-csr-approver-29563864-np9td"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.368221 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfp29\" (UniqueName: \"kubernetes.io/projected/93ef8df9-98a4-4897-918d-b573fc50f7bb-kube-api-access-tfp29\") pod \"auto-csr-approver-29563864-np9td\" (UID: \"93ef8df9-98a4-4897-918d-b573fc50f7bb\") " pod="openshift-infra/auto-csr-approver-29563864-np9td"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.392814 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfp29\" (UniqueName: \"kubernetes.io/projected/93ef8df9-98a4-4897-918d-b573fc50f7bb-kube-api-access-tfp29\") pod \"auto-csr-approver-29563864-np9td\" (UID: \"93ef8df9-98a4-4897-918d-b573fc50f7bb\") " pod="openshift-infra/auto-csr-approver-29563864-np9td"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.510734 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563864-np9td"
Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.983328 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563864-np9td"]
Mar 18 11:04:01 crc kubenswrapper[4778]: I0318 11:04:01.420525 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563864-np9td" event={"ID":"93ef8df9-98a4-4897-918d-b573fc50f7bb","Type":"ContainerStarted","Data":"0d92a478b867c5082f49cd2e833809c8ef4e6522acb91242c972c6f922ac9da2"}
Mar 18 11:04:02 crc kubenswrapper[4778]: I0318 11:04:02.429114 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563864-np9td" event={"ID":"93ef8df9-98a4-4897-918d-b573fc50f7bb","Type":"ContainerStarted","Data":"806be4b53bf081091f4914e3382dec40c771f43c71215c8124342f2c0296cb37"}
Mar 18 11:04:03 crc kubenswrapper[4778]: I0318 11:04:03.439428 4778 generic.go:334] "Generic (PLEG): container finished" podID="93ef8df9-98a4-4897-918d-b573fc50f7bb" containerID="806be4b53bf081091f4914e3382dec40c771f43c71215c8124342f2c0296cb37" exitCode=0
Mar 18 11:04:03 crc kubenswrapper[4778]: I0318 11:04:03.439534 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563864-np9td" event={"ID":"93ef8df9-98a4-4897-918d-b573fc50f7bb","Type":"ContainerDied","Data":"806be4b53bf081091f4914e3382dec40c771f43c71215c8124342f2c0296cb37"}
Mar 18 11:04:04 crc kubenswrapper[4778]: I0318 11:04:04.783826 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563864-np9td"
Mar 18 11:04:04 crc kubenswrapper[4778]: I0318 11:04:04.873873 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfp29\" (UniqueName: \"kubernetes.io/projected/93ef8df9-98a4-4897-918d-b573fc50f7bb-kube-api-access-tfp29\") pod \"93ef8df9-98a4-4897-918d-b573fc50f7bb\" (UID: \"93ef8df9-98a4-4897-918d-b573fc50f7bb\") "
Mar 18 11:04:04 crc kubenswrapper[4778]: I0318 11:04:04.885489 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ef8df9-98a4-4897-918d-b573fc50f7bb-kube-api-access-tfp29" (OuterVolumeSpecName: "kube-api-access-tfp29") pod "93ef8df9-98a4-4897-918d-b573fc50f7bb" (UID: "93ef8df9-98a4-4897-918d-b573fc50f7bb"). InnerVolumeSpecName "kube-api-access-tfp29". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:04:04 crc kubenswrapper[4778]: I0318 11:04:04.976933 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfp29\" (UniqueName: \"kubernetes.io/projected/93ef8df9-98a4-4897-918d-b573fc50f7bb-kube-api-access-tfp29\") on node \"crc\" DevicePath \"\""
Mar 18 11:04:05 crc kubenswrapper[4778]: I0318 11:04:05.463489 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563864-np9td" event={"ID":"93ef8df9-98a4-4897-918d-b573fc50f7bb","Type":"ContainerDied","Data":"0d92a478b867c5082f49cd2e833809c8ef4e6522acb91242c972c6f922ac9da2"}
Mar 18 11:04:05 crc kubenswrapper[4778]: I0318 11:04:05.463530 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d92a478b867c5082f49cd2e833809c8ef4e6522acb91242c972c6f922ac9da2"
Mar 18 11:04:05 crc kubenswrapper[4778]: I0318 11:04:05.463565 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563864-np9td"
Mar 18 11:04:05 crc kubenswrapper[4778]: I0318 11:04:05.880309 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563858-mb4zj"]
Mar 18 11:04:05 crc kubenswrapper[4778]: I0318 11:04:05.891485 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563858-mb4zj"]
Mar 18 11:04:06 crc kubenswrapper[4778]: I0318 11:04:06.225872 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e4f7f22-f4dd-4291-b26b-1a54380c3851" path="/var/lib/kubelet/pods/9e4f7f22-f4dd-4291-b26b-1a54380c3851/volumes"
Mar 18 11:04:07 crc kubenswrapper[4778]: I0318 11:04:07.188295 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:04:07 crc kubenswrapper[4778]: E0318 11:04:07.188701 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.090145 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/util/0.log"
Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.323396 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/util/0.log"
Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.366498 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/pull/0.log"
Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.387314 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/pull/0.log"
Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.535219 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/extract/0.log"
Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.560789 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/util/0.log"
Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.599604 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/pull/0.log"
Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.701448 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/util/0.log"
Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.884458 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/pull/0.log"
Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.893135 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/pull/0.log"
Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.915039 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/util/0.log"
Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.076785 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/extract/0.log"
Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.079049 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/pull/0.log"
Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.093725 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/util/0.log"
Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.248574 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-utilities/0.log"
Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.409133 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-utilities/0.log"
Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.425892 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-content/0.log"
Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.425902 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-content/0.log"
Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.664019 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-content/0.log"
Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.741544 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-utilities/0.log"
Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.939492 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-utilities/0.log"
Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.233191 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-content/0.log"
Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.273374 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-utilities/0.log"
Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.326967 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-content/0.log"
Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.518027 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-utilities/0.log"
Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.550403 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-content/0.log"
Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.734604 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/registry-server/0.log"
Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.776889 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jj774_e037e8cd-1543-49a8-9389-4cc6f440c4b3/marketplace-operator/0.log"
Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.940823 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-utilities/0.log"
Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.207787 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-content/0.log"
Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.292414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-content/0.log"
Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.298627 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-utilities/0.log"
Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.368101 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/registry-server/0.log"
Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.504603 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-content/0.log"
Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.533299 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-utilities/0.log"
Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.740594 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-utilities/0.log"
Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.821840 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/registry-server/0.log"
Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.949158 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-utilities/0.log"
Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.949239 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-content/0.log"
Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.949162 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-content/0.log"
Mar 18 11:04:13 crc kubenswrapper[4778]: I0318 11:04:13.160947 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-utilities/0.log"
Mar 18 11:04:13 crc kubenswrapper[4778]: I0318 11:04:13.198966 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-content/0.log"
Mar 18 11:04:13 crc kubenswrapper[4778]: I0318 11:04:13.996901 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/registry-server/0.log"
Mar 18 11:04:22 crc kubenswrapper[4778]: I0318 11:04:22.187428 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:04:22 crc kubenswrapper[4778]: E0318 11:04:22.188323 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:04:35 crc kubenswrapper[4778]: I0318 11:04:35.187381 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:04:35 crc kubenswrapper[4778]: E0318 11:04:35.187406 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Mar 18 11:04:35 crc kubenswrapper[4778]: E0318 11:04:35.188302 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:04:38 crc kubenswrapper[4778]: I0318 11:04:38.053738 4778 scope.go:117] "RemoveContainer" containerID="301c31c55dd167d2c6c06a6c3d13b7a706f6ed65cd7e2a490dde753952b7fad3"
Mar 18 11:04:47 crc kubenswrapper[4778]: I0318 11:04:47.187902 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:04:47 crc kubenswrapper[4778]: E0318 11:04:47.188799 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:04:58 crc kubenswrapper[4778]: I0318 11:04:58.191750 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:04:58 crc kubenswrapper[4778]: E0318 11:04:58.192936 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:05:09 crc kubenswrapper[4778]: I0318 11:05:09.187806 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:05:09 crc kubenswrapper[4778]: E0318 11:05:09.188561 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:05:24 crc kubenswrapper[4778]: I0318 11:05:24.207535 4778 scope.go:117]
"RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:05:24 crc kubenswrapper[4778]: E0318 11:05:24.208880 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:05:36 crc kubenswrapper[4778]: E0318 11:05:36.186924 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:05:39 crc kubenswrapper[4778]: I0318 11:05:39.188821 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:05:39 crc kubenswrapper[4778]: E0318 11:05:39.189601 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:05:52 crc kubenswrapper[4778]: I0318 11:05:52.188245 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:05:52 crc kubenswrapper[4778]: E0318 11:05:52.189214 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.170857 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563866-8zx72"] Mar 18 11:06:00 crc kubenswrapper[4778]: E0318 11:06:00.171764 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ef8df9-98a4-4897-918d-b573fc50f7bb" containerName="oc" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.171776 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ef8df9-98a4-4897-918d-b573fc50f7bb" containerName="oc" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.171931 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ef8df9-98a4-4897-918d-b573fc50f7bb" containerName="oc" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.172595 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.178483 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.178826 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.179139 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.209618 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563866-8zx72"] Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.262342 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qvm2\" (UniqueName: \"kubernetes.io/projected/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c-kube-api-access-9qvm2\") pod \"auto-csr-approver-29563866-8zx72\" (UID: \"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c\") " pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.364528 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qvm2\" (UniqueName: \"kubernetes.io/projected/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c-kube-api-access-9qvm2\") pod \"auto-csr-approver-29563866-8zx72\" (UID: \"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c\") " pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.395983 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qvm2\" (UniqueName: \"kubernetes.io/projected/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c-kube-api-access-9qvm2\") pod \"auto-csr-approver-29563866-8zx72\" (UID: \"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c\") " 
pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.506621 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.994456 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563866-8zx72"] Mar 18 11:06:00 crc kubenswrapper[4778]: W0318 11:06:00.996455 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b734773_4a1f_4acd_80e9_e3cd0cf14c2c.slice/crio-aed187c552143de0bae81a1d26ef671cf9348f4b261a733568bda00e48b58d34 WatchSource:0}: Error finding container aed187c552143de0bae81a1d26ef671cf9348f4b261a733568bda00e48b58d34: Status 404 returned error can't find the container with id aed187c552143de0bae81a1d26ef671cf9348f4b261a733568bda00e48b58d34 Mar 18 11:06:01 crc kubenswrapper[4778]: I0318 11:06:01.638223 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563866-8zx72" event={"ID":"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c","Type":"ContainerStarted","Data":"aed187c552143de0bae81a1d26ef671cf9348f4b261a733568bda00e48b58d34"} Mar 18 11:06:02 crc kubenswrapper[4778]: I0318 11:06:02.656953 4778 generic.go:334] "Generic (PLEG): container finished" podID="6b734773-4a1f-4acd-80e9-e3cd0cf14c2c" containerID="9ce4ad858c60f25a18c86f0360777510f04c706cb5eafb4da8787fc9df1829e5" exitCode=0 Mar 18 11:06:02 crc kubenswrapper[4778]: I0318 11:06:02.657031 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563866-8zx72" event={"ID":"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c","Type":"ContainerDied","Data":"9ce4ad858c60f25a18c86f0360777510f04c706cb5eafb4da8787fc9df1829e5"} Mar 18 11:06:03 crc kubenswrapper[4778]: I0318 11:06:03.189368 4778 scope.go:117] "RemoveContainer" 
containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:06:03 crc kubenswrapper[4778]: E0318 11:06:03.189802 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.011019 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.177735 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qvm2\" (UniqueName: \"kubernetes.io/projected/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c-kube-api-access-9qvm2\") pod \"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c\" (UID: \"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c\") " Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.183564 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c-kube-api-access-9qvm2" (OuterVolumeSpecName: "kube-api-access-9qvm2") pod "6b734773-4a1f-4acd-80e9-e3cd0cf14c2c" (UID: "6b734773-4a1f-4acd-80e9-e3cd0cf14c2c"). InnerVolumeSpecName "kube-api-access-9qvm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.283429 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qvm2\" (UniqueName: \"kubernetes.io/projected/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c-kube-api-access-9qvm2\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.676559 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563866-8zx72" event={"ID":"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c","Type":"ContainerDied","Data":"aed187c552143de0bae81a1d26ef671cf9348f4b261a733568bda00e48b58d34"} Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.676635 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed187c552143de0bae81a1d26ef671cf9348f4b261a733568bda00e48b58d34" Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.676752 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:05 crc kubenswrapper[4778]: I0318 11:06:05.106981 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563860-9p79f"] Mar 18 11:06:05 crc kubenswrapper[4778]: I0318 11:06:05.116482 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563860-9p79f"] Mar 18 11:06:06 crc kubenswrapper[4778]: I0318 11:06:06.200000 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bbe37de-66b2-4c42-a72f-92155eb2edb9" path="/var/lib/kubelet/pods/9bbe37de-66b2-4c42-a72f-92155eb2edb9/volumes" Mar 18 11:06:18 crc kubenswrapper[4778]: I0318 11:06:18.188014 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:06:18 crc kubenswrapper[4778]: E0318 11:06:18.188931 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:06:29 crc kubenswrapper[4778]: I0318 11:06:29.187245 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:06:29 crc kubenswrapper[4778]: E0318 11:06:29.188006 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.475090 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k4rg6"] Mar 18 11:06:30 crc kubenswrapper[4778]: E0318 11:06:30.476091 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b734773-4a1f-4acd-80e9-e3cd0cf14c2c" containerName="oc" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.476112 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b734773-4a1f-4acd-80e9-e3cd0cf14c2c" containerName="oc" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.476501 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b734773-4a1f-4acd-80e9-e3cd0cf14c2c" containerName="oc" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.481612 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.506047 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4rg6"] Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.589269 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-catalog-content\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.589503 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9j59\" (UniqueName: \"kubernetes.io/projected/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-kube-api-access-d9j59\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.589748 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-utilities\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.691456 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-utilities\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.691650 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-catalog-content\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.691723 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9j59\" (UniqueName: \"kubernetes.io/projected/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-kube-api-access-d9j59\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.692099 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-utilities\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.692227 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-catalog-content\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.715210 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9j59\" (UniqueName: \"kubernetes.io/projected/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-kube-api-access-d9j59\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.822526 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:31 crc kubenswrapper[4778]: W0318 11:06:31.327562 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc748da1f_65e2_4349_86f3_bfcc90cc7d1c.slice/crio-49bfb1e3c7a826c25e45fd281397eec07f7f2e8c7d44afd520b898a0aed12985 WatchSource:0}: Error finding container 49bfb1e3c7a826c25e45fd281397eec07f7f2e8c7d44afd520b898a0aed12985: Status 404 returned error can't find the container with id 49bfb1e3c7a826c25e45fd281397eec07f7f2e8c7d44afd520b898a0aed12985 Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.349256 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4rg6"] Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.973551 4778 generic.go:334] "Generic (PLEG): container finished" podID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerID="e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3" exitCode=0 Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.973883 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerDied","Data":"e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3"} Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.974057 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerStarted","Data":"49bfb1e3c7a826c25e45fd281397eec07f7f2e8c7d44afd520b898a0aed12985"} Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.976870 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.978349 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="2c2e2094-7c48-4653-8b53-95483d470344" containerID="8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa" exitCode=0 Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.978416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" event={"ID":"2c2e2094-7c48-4653-8b53-95483d470344","Type":"ContainerDied","Data":"8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa"} Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.979306 4778 scope.go:117] "RemoveContainer" containerID="8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa" Mar 18 11:06:32 crc kubenswrapper[4778]: I0318 11:06:32.609445 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n6w9k_must-gather-5mjwn_2c2e2094-7c48-4653-8b53-95483d470344/gather/0.log" Mar 18 11:06:34 crc kubenswrapper[4778]: I0318 11:06:34.011215 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerStarted","Data":"7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860"} Mar 18 11:06:35 crc kubenswrapper[4778]: I0318 11:06:35.031880 4778 generic.go:334] "Generic (PLEG): container finished" podID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerID="7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860" exitCode=0 Mar 18 11:06:35 crc kubenswrapper[4778]: I0318 11:06:35.031970 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerDied","Data":"7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860"} Mar 18 11:06:36 crc kubenswrapper[4778]: I0318 11:06:36.048002 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" 
event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerStarted","Data":"dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328"} Mar 18 11:06:36 crc kubenswrapper[4778]: I0318 11:06:36.086706 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k4rg6" podStartSLOduration=2.505150333 podStartE2EDuration="6.086685151s" podCreationTimestamp="2026-03-18 11:06:30 +0000 UTC" firstStartedPulling="2026-03-18 11:06:31.976429128 +0000 UTC m=+7458.551173998" lastFinishedPulling="2026-03-18 11:06:35.557963966 +0000 UTC m=+7462.132708816" observedRunningTime="2026-03-18 11:06:36.072044474 +0000 UTC m=+7462.646789334" watchObservedRunningTime="2026-03-18 11:06:36.086685151 +0000 UTC m=+7462.661430001" Mar 18 11:06:38 crc kubenswrapper[4778]: I0318 11:06:38.170360 4778 scope.go:117] "RemoveContainer" containerID="1bb8821bd2c18d4bb5e7f9c4c0784d606dc27180e5e74bcaf381cd0d404e43fd" Mar 18 11:06:38 crc kubenswrapper[4778]: I0318 11:06:38.224535 4778 scope.go:117] "RemoveContainer" containerID="3fd6d1f0823c046b310eb8beb463813250215ced92cb6e72ad8093250484ff70" Mar 18 11:06:40 crc kubenswrapper[4778]: I0318 11:06:40.822678 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:40 crc kubenswrapper[4778]: I0318 11:06:40.823073 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:40 crc kubenswrapper[4778]: I0318 11:06:40.877011 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.013019 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n6w9k/must-gather-5mjwn"] Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.013333 4778 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="copy" containerID="cri-o://e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec" gracePeriod=2 Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.026732 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n6w9k/must-gather-5mjwn"] Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.168370 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.218356 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4rg6"] Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.506146 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n6w9k_must-gather-5mjwn_2c2e2094-7c48-4653-8b53-95483d470344/copy/0.log" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.506706 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.680306 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zvb6\" (UniqueName: \"kubernetes.io/projected/2c2e2094-7c48-4653-8b53-95483d470344-kube-api-access-6zvb6\") pod \"2c2e2094-7c48-4653-8b53-95483d470344\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.680362 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c2e2094-7c48-4653-8b53-95483d470344-must-gather-output\") pod \"2c2e2094-7c48-4653-8b53-95483d470344\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.690970 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2e2094-7c48-4653-8b53-95483d470344-kube-api-access-6zvb6" (OuterVolumeSpecName: "kube-api-access-6zvb6") pod "2c2e2094-7c48-4653-8b53-95483d470344" (UID: "2c2e2094-7c48-4653-8b53-95483d470344"). InnerVolumeSpecName "kube-api-access-6zvb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.782702 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zvb6\" (UniqueName: \"kubernetes.io/projected/2c2e2094-7c48-4653-8b53-95483d470344-kube-api-access-6zvb6\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.880743 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2e2094-7c48-4653-8b53-95483d470344-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2c2e2094-7c48-4653-8b53-95483d470344" (UID: "2c2e2094-7c48-4653-8b53-95483d470344"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.885180 4778 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c2e2094-7c48-4653-8b53-95483d470344-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.113121 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n6w9k_must-gather-5mjwn_2c2e2094-7c48-4653-8b53-95483d470344/copy/0.log" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.115807 4778 generic.go:334] "Generic (PLEG): container finished" podID="2c2e2094-7c48-4653-8b53-95483d470344" containerID="e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec" exitCode=143 Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.115906 4778 scope.go:117] "RemoveContainer" containerID="e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.115867 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.148437 4778 scope.go:117] "RemoveContainer" containerID="8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.200611 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2e2094-7c48-4653-8b53-95483d470344" path="/var/lib/kubelet/pods/2c2e2094-7c48-4653-8b53-95483d470344/volumes" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.257272 4778 scope.go:117] "RemoveContainer" containerID="e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec" Mar 18 11:06:42 crc kubenswrapper[4778]: E0318 11:06:42.258330 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec\": container with ID starting with e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec not found: ID does not exist" containerID="e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.258366 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec"} err="failed to get container status \"e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec\": rpc error: code = NotFound desc = could not find container \"e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec\": container with ID starting with e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec not found: ID does not exist" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.258388 4778 scope.go:117] "RemoveContainer" containerID="8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa" Mar 18 11:06:42 crc kubenswrapper[4778]: E0318 11:06:42.258707 4778 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa\": container with ID starting with 8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa not found: ID does not exist" containerID="8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.258726 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa"} err="failed to get container status \"8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa\": rpc error: code = NotFound desc = could not find container \"8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa\": container with ID starting with 8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa not found: ID does not exist" Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.129371 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k4rg6" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="registry-server" containerID="cri-o://dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328" gracePeriod=2 Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.745181 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.926975 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-catalog-content\") pod \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.927094 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-utilities\") pod \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.927265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9j59\" (UniqueName: \"kubernetes.io/projected/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-kube-api-access-d9j59\") pod \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.928127 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-utilities" (OuterVolumeSpecName: "utilities") pod "c748da1f-65e2-4349-86f3-bfcc90cc7d1c" (UID: "c748da1f-65e2-4349-86f3-bfcc90cc7d1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.942340 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-kube-api-access-d9j59" (OuterVolumeSpecName: "kube-api-access-d9j59") pod "c748da1f-65e2-4349-86f3-bfcc90cc7d1c" (UID: "c748da1f-65e2-4349-86f3-bfcc90cc7d1c"). InnerVolumeSpecName "kube-api-access-d9j59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.962389 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c748da1f-65e2-4349-86f3-bfcc90cc7d1c" (UID: "c748da1f-65e2-4349-86f3-bfcc90cc7d1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.029938 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9j59\" (UniqueName: \"kubernetes.io/projected/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-kube-api-access-d9j59\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.029978 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.029988 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.146554 4778 generic.go:334] "Generic (PLEG): container finished" podID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerID="dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328" exitCode=0 Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.146602 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerDied","Data":"dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328"} Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.146628 4778 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.146642 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerDied","Data":"49bfb1e3c7a826c25e45fd281397eec07f7f2e8c7d44afd520b898a0aed12985"} Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.146718 4778 scope.go:117] "RemoveContainer" containerID="dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.173423 4778 scope.go:117] "RemoveContainer" containerID="7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.198111 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:06:44 crc kubenswrapper[4778]: E0318 11:06:44.198467 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.203595 4778 scope.go:117] "RemoveContainer" containerID="e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.218092 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4rg6"] Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.218143 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4rg6"] Mar 18 11:06:44 crc kubenswrapper[4778]: 
I0318 11:06:44.260859 4778 scope.go:117] "RemoveContainer" containerID="dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328" Mar 18 11:06:44 crc kubenswrapper[4778]: E0318 11:06:44.261372 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328\": container with ID starting with dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328 not found: ID does not exist" containerID="dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.261427 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328"} err="failed to get container status \"dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328\": rpc error: code = NotFound desc = could not find container \"dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328\": container with ID starting with dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328 not found: ID does not exist" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.261458 4778 scope.go:117] "RemoveContainer" containerID="7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860" Mar 18 11:06:44 crc kubenswrapper[4778]: E0318 11:06:44.261762 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860\": container with ID starting with 7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860 not found: ID does not exist" containerID="7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.261794 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860"} err="failed to get container status \"7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860\": rpc error: code = NotFound desc = could not find container \"7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860\": container with ID starting with 7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860 not found: ID does not exist" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.261809 4778 scope.go:117] "RemoveContainer" containerID="e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3" Mar 18 11:06:44 crc kubenswrapper[4778]: E0318 11:06:44.262018 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3\": container with ID starting with e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3 not found: ID does not exist" containerID="e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.262046 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3"} err="failed to get container status \"e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3\": rpc error: code = NotFound desc = could not find container \"e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3\": container with ID starting with e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3 not found: ID does not exist" Mar 18 11:06:46 crc kubenswrapper[4778]: I0318 11:06:46.199672 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" path="/var/lib/kubelet/pods/c748da1f-65e2-4349-86f3-bfcc90cc7d1c/volumes" Mar 18 11:06:54 crc kubenswrapper[4778]: E0318 
11:06:54.193767 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:06:56 crc kubenswrapper[4778]: I0318 11:06:56.187288 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:06:56 crc kubenswrapper[4778]: E0318 11:06:56.187938 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:07:08 crc kubenswrapper[4778]: I0318 11:07:08.187617 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:07:08 crc kubenswrapper[4778]: I0318 11:07:08.479980 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"cbe46d0fa65f0ead7ad1789d13ec33e6d93d408d865f08e3bbc666ffb661db35"} Mar 18 11:07:38 crc kubenswrapper[4778]: I0318 11:07:38.304199 4778 scope.go:117] "RemoveContainer" containerID="4885ccf444eddc8a6ce122274d5f1476536fe5db0e3a880ac80a8141660c536e" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.158564 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563868-txd2q"] Mar 18 11:08:00 crc kubenswrapper[4778]: E0318 11:08:00.159982 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" 
containerName="extract-content" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.159997 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="extract-content" Mar 18 11:08:00 crc kubenswrapper[4778]: E0318 11:08:00.160014 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="registry-server" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.160021 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="registry-server" Mar 18 11:08:00 crc kubenswrapper[4778]: E0318 11:08:00.160031 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="extract-utilities" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.160039 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="extract-utilities" Mar 18 11:08:00 crc kubenswrapper[4778]: E0318 11:08:00.160060 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="gather" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.160066 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="gather" Mar 18 11:08:00 crc kubenswrapper[4778]: E0318 11:08:00.160099 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="copy" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.160105 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="copy" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.160334 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="registry-server" Mar 18 11:08:00 crc 
kubenswrapper[4778]: I0318 11:08:00.160347 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="gather" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.160360 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="copy" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.161215 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.163876 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.163970 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.164799 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.170857 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563868-txd2q"] Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.268544 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w4fz\" (UniqueName: \"kubernetes.io/projected/48c91868-ef15-4d6d-8547-1b2849d7aa95-kube-api-access-4w4fz\") pod \"auto-csr-approver-29563868-txd2q\" (UID: \"48c91868-ef15-4d6d-8547-1b2849d7aa95\") " pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.372171 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w4fz\" (UniqueName: \"kubernetes.io/projected/48c91868-ef15-4d6d-8547-1b2849d7aa95-kube-api-access-4w4fz\") pod 
\"auto-csr-approver-29563868-txd2q\" (UID: \"48c91868-ef15-4d6d-8547-1b2849d7aa95\") " pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.397148 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w4fz\" (UniqueName: \"kubernetes.io/projected/48c91868-ef15-4d6d-8547-1b2849d7aa95-kube-api-access-4w4fz\") pod \"auto-csr-approver-29563868-txd2q\" (UID: \"48c91868-ef15-4d6d-8547-1b2849d7aa95\") " pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.490534 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.982470 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563868-txd2q"] Mar 18 11:08:01 crc kubenswrapper[4778]: I0318 11:08:01.062343 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563868-txd2q" event={"ID":"48c91868-ef15-4d6d-8547-1b2849d7aa95","Type":"ContainerStarted","Data":"741f413682316d7f270de9e47aff7afe5389dd5b05acec7112518188196cc08f"} Mar 18 11:08:03 crc kubenswrapper[4778]: I0318 11:08:03.083906 4778 generic.go:334] "Generic (PLEG): container finished" podID="48c91868-ef15-4d6d-8547-1b2849d7aa95" containerID="ecc2f8a6686d5391d07b662f53f7a3bdd9927adf67509a229601871555c0b456" exitCode=0 Mar 18 11:08:03 crc kubenswrapper[4778]: I0318 11:08:03.084131 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563868-txd2q" event={"ID":"48c91868-ef15-4d6d-8547-1b2849d7aa95","Type":"ContainerDied","Data":"ecc2f8a6686d5391d07b662f53f7a3bdd9927adf67509a229601871555c0b456"} Mar 18 11:08:03 crc kubenswrapper[4778]: E0318 11:08:03.188025 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" 
podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:08:04 crc kubenswrapper[4778]: I0318 11:08:04.573977 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:04 crc kubenswrapper[4778]: I0318 11:08:04.680614 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w4fz\" (UniqueName: \"kubernetes.io/projected/48c91868-ef15-4d6d-8547-1b2849d7aa95-kube-api-access-4w4fz\") pod \"48c91868-ef15-4d6d-8547-1b2849d7aa95\" (UID: \"48c91868-ef15-4d6d-8547-1b2849d7aa95\") " Mar 18 11:08:04 crc kubenswrapper[4778]: I0318 11:08:04.690795 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c91868-ef15-4d6d-8547-1b2849d7aa95-kube-api-access-4w4fz" (OuterVolumeSpecName: "kube-api-access-4w4fz") pod "48c91868-ef15-4d6d-8547-1b2849d7aa95" (UID: "48c91868-ef15-4d6d-8547-1b2849d7aa95"). InnerVolumeSpecName "kube-api-access-4w4fz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:08:04 crc kubenswrapper[4778]: I0318 11:08:04.783060 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w4fz\" (UniqueName: \"kubernetes.io/projected/48c91868-ef15-4d6d-8547-1b2849d7aa95-kube-api-access-4w4fz\") on node \"crc\" DevicePath \"\"" Mar 18 11:08:05 crc kubenswrapper[4778]: I0318 11:08:05.124440 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563868-txd2q" event={"ID":"48c91868-ef15-4d6d-8547-1b2849d7aa95","Type":"ContainerDied","Data":"741f413682316d7f270de9e47aff7afe5389dd5b05acec7112518188196cc08f"} Mar 18 11:08:05 crc kubenswrapper[4778]: I0318 11:08:05.124805 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="741f413682316d7f270de9e47aff7afe5389dd5b05acec7112518188196cc08f" Mar 18 11:08:05 crc kubenswrapper[4778]: I0318 11:08:05.124870 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:05 crc kubenswrapper[4778]: E0318 11:08:05.366334 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c91868_ef15_4d6d_8547_1b2849d7aa95.slice/crio-741f413682316d7f270de9e47aff7afe5389dd5b05acec7112518188196cc08f\": RecentStats: unable to find data in memory cache]" Mar 18 11:08:05 crc kubenswrapper[4778]: I0318 11:08:05.642552 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563862-flgc8"] Mar 18 11:08:05 crc kubenswrapper[4778]: I0318 11:08:05.653277 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563862-flgc8"] Mar 18 11:08:06 crc kubenswrapper[4778]: I0318 11:08:06.196541 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0e9b6093-6849-4ee1-829c-3893c8efc355" path="/var/lib/kubelet/pods/0e9b6093-6849-4ee1-829c-3893c8efc355/volumes" Mar 18 11:08:38 crc kubenswrapper[4778]: I0318 11:08:38.469533 4778 scope.go:117] "RemoveContainer" containerID="3545c5320c999ca132cdbecfec3fe0adacffa5fbc99319fc9409f6ba39ed60ac" Mar 18 11:09:30 crc kubenswrapper[4778]: I0318 11:09:30.147455 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:09:30 crc kubenswrapper[4778]: I0318 11:09:30.147983 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:09:33 crc kubenswrapper[4778]: E0318 11:09:33.187576 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.694166 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4v7jl/must-gather-8j576"] Mar 18 11:09:56 crc kubenswrapper[4778]: E0318 11:09:56.696296 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c91868-ef15-4d6d-8547-1b2849d7aa95" containerName="oc" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.696421 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c91868-ef15-4d6d-8547-1b2849d7aa95" containerName="oc" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.696743 4778 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="48c91868-ef15-4d6d-8547-1b2849d7aa95" containerName="oc" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.698129 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.703382 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4v7jl"/"kube-root-ca.crt" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.703408 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4v7jl"/"openshift-service-ca.crt" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.703398 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4v7jl"/"default-dockercfg-5dhft" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.710481 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4v7jl/must-gather-8j576"] Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.876570 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/339d23a2-4cea-4331-b745-44219b471d41-must-gather-output\") pod \"must-gather-8j576\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.876644 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjcph\" (UniqueName: \"kubernetes.io/projected/339d23a2-4cea-4331-b745-44219b471d41-kube-api-access-cjcph\") pod \"must-gather-8j576\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.978972 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/339d23a2-4cea-4331-b745-44219b471d41-must-gather-output\") pod \"must-gather-8j576\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.979071 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjcph\" (UniqueName: \"kubernetes.io/projected/339d23a2-4cea-4331-b745-44219b471d41-kube-api-access-cjcph\") pod \"must-gather-8j576\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.979496 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/339d23a2-4cea-4331-b745-44219b471d41-must-gather-output\") pod \"must-gather-8j576\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:57 crc kubenswrapper[4778]: I0318 11:09:57.010132 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjcph\" (UniqueName: \"kubernetes.io/projected/339d23a2-4cea-4331-b745-44219b471d41-kube-api-access-cjcph\") pod \"must-gather-8j576\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:57 crc kubenswrapper[4778]: I0318 11:09:57.019175 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:57 crc kubenswrapper[4778]: I0318 11:09:57.514711 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4v7jl/must-gather-8j576"] Mar 18 11:09:57 crc kubenswrapper[4778]: W0318 11:09:57.518463 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod339d23a2_4cea_4331_b745_44219b471d41.slice/crio-5bb4383e2e401a2463e2f6aeaf7fe15a8745ae6c18fb2b0acf1409a2e0af8ff0 WatchSource:0}: Error finding container 5bb4383e2e401a2463e2f6aeaf7fe15a8745ae6c18fb2b0acf1409a2e0af8ff0: Status 404 returned error can't find the container with id 5bb4383e2e401a2463e2f6aeaf7fe15a8745ae6c18fb2b0acf1409a2e0af8ff0 Mar 18 11:09:57 crc kubenswrapper[4778]: I0318 11:09:57.657353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/must-gather-8j576" event={"ID":"339d23a2-4cea-4331-b745-44219b471d41","Type":"ContainerStarted","Data":"5bb4383e2e401a2463e2f6aeaf7fe15a8745ae6c18fb2b0acf1409a2e0af8ff0"} Mar 18 11:09:58 crc kubenswrapper[4778]: I0318 11:09:58.667273 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/must-gather-8j576" event={"ID":"339d23a2-4cea-4331-b745-44219b471d41","Type":"ContainerStarted","Data":"07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310"} Mar 18 11:09:58 crc kubenswrapper[4778]: I0318 11:09:58.667575 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/must-gather-8j576" event={"ID":"339d23a2-4cea-4331-b745-44219b471d41","Type":"ContainerStarted","Data":"376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0"} Mar 18 11:09:58 crc kubenswrapper[4778]: I0318 11:09:58.687888 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4v7jl/must-gather-8j576" podStartSLOduration=2.6878673060000002 
podStartE2EDuration="2.687867306s" podCreationTimestamp="2026-03-18 11:09:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:09:58.682682015 +0000 UTC m=+7665.257426875" watchObservedRunningTime="2026-03-18 11:09:58.687867306 +0000 UTC m=+7665.262612146" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.147365 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.147942 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.153078 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563870-s7lhp"] Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.158138 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.162598 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.162770 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.162933 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.163451 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563870-s7lhp"] Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.249257 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdld7\" (UniqueName: \"kubernetes.io/projected/08809b1c-c749-4734-9fc4-6a0a755aa9cd-kube-api-access-sdld7\") pod \"auto-csr-approver-29563870-s7lhp\" (UID: \"08809b1c-c749-4734-9fc4-6a0a755aa9cd\") " pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.351658 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdld7\" (UniqueName: \"kubernetes.io/projected/08809b1c-c749-4734-9fc4-6a0a755aa9cd-kube-api-access-sdld7\") pod \"auto-csr-approver-29563870-s7lhp\" (UID: \"08809b1c-c749-4734-9fc4-6a0a755aa9cd\") " pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.376530 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdld7\" (UniqueName: \"kubernetes.io/projected/08809b1c-c749-4734-9fc4-6a0a755aa9cd-kube-api-access-sdld7\") pod \"auto-csr-approver-29563870-s7lhp\" (UID: \"08809b1c-c749-4734-9fc4-6a0a755aa9cd\") " 
pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.539703 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.993032 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563870-s7lhp"] Mar 18 11:10:01 crc kubenswrapper[4778]: I0318 11:10:01.696293 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" event={"ID":"08809b1c-c749-4734-9fc4-6a0a755aa9cd","Type":"ContainerStarted","Data":"e0f1b0864e9fe4a32c395a265fea54f835e4d107f28e0a2e27d19f71a2fefbd4"} Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.047824 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-smwlp"] Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.048939 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.184402 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m6bc\" (UniqueName: \"kubernetes.io/projected/939aac29-edd6-4d03-a4f5-59541aa99ecd-kube-api-access-5m6bc\") pod \"crc-debug-smwlp\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.184742 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939aac29-edd6-4d03-a4f5-59541aa99ecd-host\") pod \"crc-debug-smwlp\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.287424 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6bc\" (UniqueName: \"kubernetes.io/projected/939aac29-edd6-4d03-a4f5-59541aa99ecd-kube-api-access-5m6bc\") pod \"crc-debug-smwlp\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.287501 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939aac29-edd6-4d03-a4f5-59541aa99ecd-host\") pod \"crc-debug-smwlp\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.288245 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939aac29-edd6-4d03-a4f5-59541aa99ecd-host\") pod \"crc-debug-smwlp\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc 
kubenswrapper[4778]: I0318 11:10:02.309671 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m6bc\" (UniqueName: \"kubernetes.io/projected/939aac29-edd6-4d03-a4f5-59541aa99ecd-kube-api-access-5m6bc\") pod \"crc-debug-smwlp\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.374576 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: W0318 11:10:02.421176 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod939aac29_edd6_4d03_a4f5_59541aa99ecd.slice/crio-30a87942c42ba58ae4535480008ce722d42f6808c3271d2c4c30afa2f5a8a212 WatchSource:0}: Error finding container 30a87942c42ba58ae4535480008ce722d42f6808c3271d2c4c30afa2f5a8a212: Status 404 returned error can't find the container with id 30a87942c42ba58ae4535480008ce722d42f6808c3271d2c4c30afa2f5a8a212 Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.707015 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" event={"ID":"08809b1c-c749-4734-9fc4-6a0a755aa9cd","Type":"ContainerStarted","Data":"7417bcdf486fe4210bba0dca5e997eafe86b7f08ceaf37548fb4760f00212acc"} Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.708664 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" event={"ID":"939aac29-edd6-4d03-a4f5-59541aa99ecd","Type":"ContainerStarted","Data":"b0c173a65daa3d6727011d9a85b81569efd5870086c307d8ad02c5186b648e01"} Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.708696 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" 
event={"ID":"939aac29-edd6-4d03-a4f5-59541aa99ecd","Type":"ContainerStarted","Data":"30a87942c42ba58ae4535480008ce722d42f6808c3271d2c4c30afa2f5a8a212"} Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.729392 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" podStartSLOduration=1.6047213980000001 podStartE2EDuration="2.729370004s" podCreationTimestamp="2026-03-18 11:10:00 +0000 UTC" firstStartedPulling="2026-03-18 11:10:00.998791648 +0000 UTC m=+7667.573536488" lastFinishedPulling="2026-03-18 11:10:02.123440234 +0000 UTC m=+7668.698185094" observedRunningTime="2026-03-18 11:10:02.723124045 +0000 UTC m=+7669.297868925" watchObservedRunningTime="2026-03-18 11:10:02.729370004 +0000 UTC m=+7669.304114844" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.742860 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" podStartSLOduration=0.74284247 podStartE2EDuration="742.84247ms" podCreationTimestamp="2026-03-18 11:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:10:02.735434448 +0000 UTC m=+7669.310179288" watchObservedRunningTime="2026-03-18 11:10:02.74284247 +0000 UTC m=+7669.317587310" Mar 18 11:10:03 crc kubenswrapper[4778]: I0318 11:10:03.718997 4778 generic.go:334] "Generic (PLEG): container finished" podID="08809b1c-c749-4734-9fc4-6a0a755aa9cd" containerID="7417bcdf486fe4210bba0dca5e997eafe86b7f08ceaf37548fb4760f00212acc" exitCode=0 Mar 18 11:10:03 crc kubenswrapper[4778]: I0318 11:10:03.719049 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" event={"ID":"08809b1c-c749-4734-9fc4-6a0a755aa9cd","Type":"ContainerDied","Data":"7417bcdf486fe4210bba0dca5e997eafe86b7f08ceaf37548fb4760f00212acc"} Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 
11:10:05.251525 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.354880 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdld7\" (UniqueName: \"kubernetes.io/projected/08809b1c-c749-4734-9fc4-6a0a755aa9cd-kube-api-access-sdld7\") pod \"08809b1c-c749-4734-9fc4-6a0a755aa9cd\" (UID: \"08809b1c-c749-4734-9fc4-6a0a755aa9cd\") " Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.375059 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08809b1c-c749-4734-9fc4-6a0a755aa9cd-kube-api-access-sdld7" (OuterVolumeSpecName: "kube-api-access-sdld7") pod "08809b1c-c749-4734-9fc4-6a0a755aa9cd" (UID: "08809b1c-c749-4734-9fc4-6a0a755aa9cd"). InnerVolumeSpecName "kube-api-access-sdld7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.459107 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdld7\" (UniqueName: \"kubernetes.io/projected/08809b1c-c749-4734-9fc4-6a0a755aa9cd-kube-api-access-sdld7\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.746858 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" event={"ID":"08809b1c-c749-4734-9fc4-6a0a755aa9cd","Type":"ContainerDied","Data":"e0f1b0864e9fe4a32c395a265fea54f835e4d107f28e0a2e27d19f71a2fefbd4"} Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.747195 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0f1b0864e9fe4a32c395a265fea54f835e4d107f28e0a2e27d19f71a2fefbd4" Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.747110 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.806598 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563864-np9td"] Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.819832 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563864-np9td"] Mar 18 11:10:06 crc kubenswrapper[4778]: I0318 11:10:06.200049 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ef8df9-98a4-4897-918d-b573fc50f7bb" path="/var/lib/kubelet/pods/93ef8df9-98a4-4897-918d-b573fc50f7bb/volumes" Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.147856 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.148504 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.148554 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.149416 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cbe46d0fa65f0ead7ad1789d13ec33e6d93d408d865f08e3bbc666ffb661db35"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.149476 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://cbe46d0fa65f0ead7ad1789d13ec33e6d93d408d865f08e3bbc666ffb661db35" gracePeriod=600 Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.972983 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="cbe46d0fa65f0ead7ad1789d13ec33e6d93d408d865f08e3bbc666ffb661db35" exitCode=0 Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.973059 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"cbe46d0fa65f0ead7ad1789d13ec33e6d93d408d865f08e3bbc666ffb661db35"} Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.973547 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"} Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.973572 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:10:38 crc kubenswrapper[4778]: I0318 11:10:38.596033 4778 scope.go:117] "RemoveContainer" containerID="806be4b53bf081091f4914e3382dec40c771f43c71215c8124342f2c0296cb37" Mar 18 11:10:44 crc kubenswrapper[4778]: I0318 11:10:44.079888 4778 generic.go:334] "Generic (PLEG): container finished" podID="939aac29-edd6-4d03-a4f5-59541aa99ecd" 
containerID="b0c173a65daa3d6727011d9a85b81569efd5870086c307d8ad02c5186b648e01" exitCode=0 Mar 18 11:10:44 crc kubenswrapper[4778]: I0318 11:10:44.079972 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" event={"ID":"939aac29-edd6-4d03-a4f5-59541aa99ecd","Type":"ContainerDied","Data":"b0c173a65daa3d6727011d9a85b81569efd5870086c307d8ad02c5186b648e01"} Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.189779 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.237443 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-smwlp"] Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.245235 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-smwlp"] Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.301077 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m6bc\" (UniqueName: \"kubernetes.io/projected/939aac29-edd6-4d03-a4f5-59541aa99ecd-kube-api-access-5m6bc\") pod \"939aac29-edd6-4d03-a4f5-59541aa99ecd\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.301351 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939aac29-edd6-4d03-a4f5-59541aa99ecd-host\") pod \"939aac29-edd6-4d03-a4f5-59541aa99ecd\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.301417 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/939aac29-edd6-4d03-a4f5-59541aa99ecd-host" (OuterVolumeSpecName: "host") pod "939aac29-edd6-4d03-a4f5-59541aa99ecd" (UID: "939aac29-edd6-4d03-a4f5-59541aa99ecd"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.302894 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939aac29-edd6-4d03-a4f5-59541aa99ecd-host\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.307667 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/939aac29-edd6-4d03-a4f5-59541aa99ecd-kube-api-access-5m6bc" (OuterVolumeSpecName: "kube-api-access-5m6bc") pod "939aac29-edd6-4d03-a4f5-59541aa99ecd" (UID: "939aac29-edd6-4d03-a4f5-59541aa99ecd"). InnerVolumeSpecName "kube-api-access-5m6bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.405453 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m6bc\" (UniqueName: \"kubernetes.io/projected/939aac29-edd6-4d03-a4f5-59541aa99ecd-kube-api-access-5m6bc\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.099785 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30a87942c42ba58ae4535480008ce722d42f6808c3271d2c4c30afa2f5a8a212" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.100001 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.199636 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="939aac29-edd6-4d03-a4f5-59541aa99ecd" path="/var/lib/kubelet/pods/939aac29-edd6-4d03-a4f5-59541aa99ecd/volumes" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.401602 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-6dhtw"] Mar 18 11:10:46 crc kubenswrapper[4778]: E0318 11:10:46.402002 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939aac29-edd6-4d03-a4f5-59541aa99ecd" containerName="container-00" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.402019 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="939aac29-edd6-4d03-a4f5-59541aa99ecd" containerName="container-00" Mar 18 11:10:46 crc kubenswrapper[4778]: E0318 11:10:46.402052 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08809b1c-c749-4734-9fc4-6a0a755aa9cd" containerName="oc" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.402058 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="08809b1c-c749-4734-9fc4-6a0a755aa9cd" containerName="oc" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.402232 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="939aac29-edd6-4d03-a4f5-59541aa99ecd" containerName="container-00" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.402257 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="08809b1c-c749-4734-9fc4-6a0a755aa9cd" containerName="oc" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.402846 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.526460 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/481fa9ab-0ba7-4810-9a84-bb93c8762498-host\") pod \"crc-debug-6dhtw\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.526547 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk2dh\" (UniqueName: \"kubernetes.io/projected/481fa9ab-0ba7-4810-9a84-bb93c8762498-kube-api-access-vk2dh\") pod \"crc-debug-6dhtw\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.629334 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/481fa9ab-0ba7-4810-9a84-bb93c8762498-host\") pod \"crc-debug-6dhtw\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.629401 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk2dh\" (UniqueName: \"kubernetes.io/projected/481fa9ab-0ba7-4810-9a84-bb93c8762498-kube-api-access-vk2dh\") pod \"crc-debug-6dhtw\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.629504 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/481fa9ab-0ba7-4810-9a84-bb93c8762498-host\") pod \"crc-debug-6dhtw\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc 
kubenswrapper[4778]: I0318 11:10:46.656117 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk2dh\" (UniqueName: \"kubernetes.io/projected/481fa9ab-0ba7-4810-9a84-bb93c8762498-kube-api-access-vk2dh\") pod \"crc-debug-6dhtw\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.720591 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: W0318 11:10:46.753067 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481fa9ab_0ba7_4810_9a84_bb93c8762498.slice/crio-6d01cfea264e6eb627817e6ae0d743c3a37e4318bd87bcd78ce8d09f81529c8a WatchSource:0}: Error finding container 6d01cfea264e6eb627817e6ae0d743c3a37e4318bd87bcd78ce8d09f81529c8a: Status 404 returned error can't find the container with id 6d01cfea264e6eb627817e6ae0d743c3a37e4318bd87bcd78ce8d09f81529c8a Mar 18 11:10:47 crc kubenswrapper[4778]: I0318 11:10:47.109275 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" event={"ID":"481fa9ab-0ba7-4810-9a84-bb93c8762498","Type":"ContainerStarted","Data":"4244424fcad241de0fa7ed0597ece19cff3267ef03d0dbab6082ae06fe156bfa"} Mar 18 11:10:47 crc kubenswrapper[4778]: I0318 11:10:47.109633 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" event={"ID":"481fa9ab-0ba7-4810-9a84-bb93c8762498","Type":"ContainerStarted","Data":"6d01cfea264e6eb627817e6ae0d743c3a37e4318bd87bcd78ce8d09f81529c8a"} Mar 18 11:10:47 crc kubenswrapper[4778]: I0318 11:10:47.121368 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" podStartSLOduration=1.121351005 podStartE2EDuration="1.121351005s" 
podCreationTimestamp="2026-03-18 11:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:10:47.120546794 +0000 UTC m=+7713.695291634" watchObservedRunningTime="2026-03-18 11:10:47.121351005 +0000 UTC m=+7713.696095845" Mar 18 11:10:48 crc kubenswrapper[4778]: I0318 11:10:48.117302 4778 generic.go:334] "Generic (PLEG): container finished" podID="481fa9ab-0ba7-4810-9a84-bb93c8762498" containerID="4244424fcad241de0fa7ed0597ece19cff3267ef03d0dbab6082ae06fe156bfa" exitCode=0 Mar 18 11:10:48 crc kubenswrapper[4778]: I0318 11:10:48.117555 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" event={"ID":"481fa9ab-0ba7-4810-9a84-bb93c8762498","Type":"ContainerDied","Data":"4244424fcad241de0fa7ed0597ece19cff3267ef03d0dbab6082ae06fe156bfa"} Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.239761 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.380168 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk2dh\" (UniqueName: \"kubernetes.io/projected/481fa9ab-0ba7-4810-9a84-bb93c8762498-kube-api-access-vk2dh\") pod \"481fa9ab-0ba7-4810-9a84-bb93c8762498\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.380349 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/481fa9ab-0ba7-4810-9a84-bb93c8762498-host\") pod \"481fa9ab-0ba7-4810-9a84-bb93c8762498\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.380617 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/481fa9ab-0ba7-4810-9a84-bb93c8762498-host" (OuterVolumeSpecName: "host") pod "481fa9ab-0ba7-4810-9a84-bb93c8762498" (UID: "481fa9ab-0ba7-4810-9a84-bb93c8762498"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.381011 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/481fa9ab-0ba7-4810-9a84-bb93c8762498-host\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.388379 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481fa9ab-0ba7-4810-9a84-bb93c8762498-kube-api-access-vk2dh" (OuterVolumeSpecName: "kube-api-access-vk2dh") pod "481fa9ab-0ba7-4810-9a84-bb93c8762498" (UID: "481fa9ab-0ba7-4810-9a84-bb93c8762498"). InnerVolumeSpecName "kube-api-access-vk2dh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.482334 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk2dh\" (UniqueName: \"kubernetes.io/projected/481fa9ab-0ba7-4810-9a84-bb93c8762498-kube-api-access-vk2dh\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.709654 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-6dhtw"] Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.717442 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-6dhtw"] Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.147054 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d01cfea264e6eb627817e6ae0d743c3a37e4318bd87bcd78ce8d09f81529c8a" Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.147120 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.197030 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481fa9ab-0ba7-4810-9a84-bb93c8762498" path="/var/lib/kubelet/pods/481fa9ab-0ba7-4810-9a84-bb93c8762498/volumes" Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.881448 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-hvrms"] Mar 18 11:10:50 crc kubenswrapper[4778]: E0318 11:10:50.882143 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481fa9ab-0ba7-4810-9a84-bb93c8762498" containerName="container-00" Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.882158 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="481fa9ab-0ba7-4810-9a84-bb93c8762498" containerName="container-00" Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.882454 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="481fa9ab-0ba7-4810-9a84-bb93c8762498" containerName="container-00" Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.883130 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc kubenswrapper[4778]: I0318 11:10:51.012435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-host\") pod \"crc-debug-hvrms\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc kubenswrapper[4778]: I0318 11:10:51.012581 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwvzt\" (UniqueName: \"kubernetes.io/projected/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-kube-api-access-pwvzt\") pod \"crc-debug-hvrms\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc kubenswrapper[4778]: I0318 11:10:51.115512 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-host\") pod \"crc-debug-hvrms\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc kubenswrapper[4778]: I0318 11:10:51.115619 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwvzt\" (UniqueName: \"kubernetes.io/projected/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-kube-api-access-pwvzt\") pod \"crc-debug-hvrms\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc kubenswrapper[4778]: I0318 11:10:51.115703 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-host\") pod \"crc-debug-hvrms\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc 
kubenswrapper[4778]: I0318 11:10:51.135384 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwvzt\" (UniqueName: \"kubernetes.io/projected/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-kube-api-access-pwvzt\") pod \"crc-debug-hvrms\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc kubenswrapper[4778]: I0318 11:10:51.205010 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc kubenswrapper[4778]: W0318 11:10:51.235380 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07b1b987_0fc7_4eb1_b6e3_2bc047d48992.slice/crio-ff888d19c52cc029981609bfcbbe53bf53880a3c77a2333cfce2657b67ef370d WatchSource:0}: Error finding container ff888d19c52cc029981609bfcbbe53bf53880a3c77a2333cfce2657b67ef370d: Status 404 returned error can't find the container with id ff888d19c52cc029981609bfcbbe53bf53880a3c77a2333cfce2657b67ef370d Mar 18 11:10:52 crc kubenswrapper[4778]: I0318 11:10:52.180308 4778 generic.go:334] "Generic (PLEG): container finished" podID="07b1b987-0fc7-4eb1-b6e3-2bc047d48992" containerID="22318d16c06934bfd283593ddd9c5161092c2f6f2dea9339dca1338b3f67afa1" exitCode=0 Mar 18 11:10:52 crc kubenswrapper[4778]: I0318 11:10:52.180458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-hvrms" event={"ID":"07b1b987-0fc7-4eb1-b6e3-2bc047d48992","Type":"ContainerDied","Data":"22318d16c06934bfd283593ddd9c5161092c2f6f2dea9339dca1338b3f67afa1"} Mar 18 11:10:52 crc kubenswrapper[4778]: I0318 11:10:52.180634 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-hvrms" event={"ID":"07b1b987-0fc7-4eb1-b6e3-2bc047d48992","Type":"ContainerStarted","Data":"ff888d19c52cc029981609bfcbbe53bf53880a3c77a2333cfce2657b67ef370d"} Mar 18 
11:10:52 crc kubenswrapper[4778]: I0318 11:10:52.229314 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-hvrms"] Mar 18 11:10:52 crc kubenswrapper[4778]: I0318 11:10:52.239458 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-hvrms"] Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.290756 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.461796 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-host\") pod \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.462167 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwvzt\" (UniqueName: \"kubernetes.io/projected/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-kube-api-access-pwvzt\") pod \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.461942 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-host" (OuterVolumeSpecName: "host") pod "07b1b987-0fc7-4eb1-b6e3-2bc047d48992" (UID: "07b1b987-0fc7-4eb1-b6e3-2bc047d48992"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.462618 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-host\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.468014 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-kube-api-access-pwvzt" (OuterVolumeSpecName: "kube-api-access-pwvzt") pod "07b1b987-0fc7-4eb1-b6e3-2bc047d48992" (UID: "07b1b987-0fc7-4eb1-b6e3-2bc047d48992"). InnerVolumeSpecName "kube-api-access-pwvzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.564911 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwvzt\" (UniqueName: \"kubernetes.io/projected/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-kube-api-access-pwvzt\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:54 crc kubenswrapper[4778]: I0318 11:10:54.197739 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07b1b987-0fc7-4eb1-b6e3-2bc047d48992" path="/var/lib/kubelet/pods/07b1b987-0fc7-4eb1-b6e3-2bc047d48992/volumes" Mar 18 11:10:54 crc kubenswrapper[4778]: I0318 11:10:54.201485 4778 scope.go:117] "RemoveContainer" containerID="22318d16c06934bfd283593ddd9c5161092c2f6f2dea9339dca1338b3f67afa1" Mar 18 11:10:54 crc kubenswrapper[4778]: I0318 11:10:54.201535 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:58 crc kubenswrapper[4778]: E0318 11:10:58.187250 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.071864 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ansibletest-ansibletest_1fb58f5e-1c8b-45e2-bf86-b81af58b66a9/ansibletest-ansibletest/0.log" Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.205147 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84d7458cd-cb86l_e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55/barbican-api/0.log" Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.249562 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84d7458cd-cb86l_e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55/barbican-api-log/0.log" Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.417383 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8bc77f476-tw7vd_3a006670-1a48-4421-8471-dd961c0e1d4c/barbican-keystone-listener/0.log" Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.621939 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-769d964c9f-nxhk2_eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b/barbican-worker/0.log" Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.676926 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-769d964c9f-nxhk2_eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b/barbican-worker-log/0.log" Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.856262 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk_f4bddd5e-314b-49c0-963c-107e6798c40e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.950378 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8bc77f476-tw7vd_3a006670-1a48-4421-8471-dd961c0e1d4c/barbican-keystone-listener-log/0.log" Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.024090 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/ceilometer-central-agent/0.log" Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.089276 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/ceilometer-notification-agent/0.log" Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.178965 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/proxy-httpd/0.log" Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.231356 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/sg-core/0.log" Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.304414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv_fed5a515-ed14-40f1-9282-4e87fe319bf6/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.398777 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl_34acd7f6-6263-4871-892c-02835ebbab27/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.598507 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51/cinder-api/0.log" Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.621582 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51/cinder-api-log/0.log" Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.877682 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a419ad60-27c7-4a74-a7a0-f6b04b3bcb13/cinder-backup/0.log" Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.887872 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a419ad60-27c7-4a74-a7a0-f6b04b3bcb13/probe/0.log" Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.986235 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_bbde13ad-dacc-4f17-8da3-109ede6972c0/cinder-scheduler/0.log" Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.145715 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_bbde13ad-dacc-4f17-8da3-109ede6972c0/probe/0.log" Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.276185 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_81d18509-d2fc-47e2-b814-94c4807a4dd6/probe/0.log" Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.304558 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_81d18509-d2fc-47e2-b814-94c4807a4dd6/cinder-volume/0.log" Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.385424 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5_d44d6afe-0030-4d9d-9fa7-f75274eff578/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.511876 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-r64nk_4f5bf2d2-78b2-4358-a582-482ab3020da3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.616955 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-pr7j8_78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3/init/0.log" Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.835748 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-pr7j8_78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3/init/0.log" Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.841042 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd/glance-httpd/0.log" Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.994489 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-pr7j8_78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3/dnsmasq-dns/0.log" Mar 18 11:11:48 crc kubenswrapper[4778]: I0318 11:11:48.053362 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd/glance-log/0.log" Mar 18 11:11:48 crc kubenswrapper[4778]: I0318 11:11:48.101081 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a18a46b5-39a7-4da9-8994-5c4716bc0fc3/glance-httpd/0.log" Mar 18 11:11:48 crc kubenswrapper[4778]: I0318 11:11:48.171209 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a18a46b5-39a7-4da9-8994-5c4716bc0fc3/glance-log/0.log" Mar 18 11:11:48 crc kubenswrapper[4778]: I0318 11:11:48.322414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-644f48df4-b7jhq_e0a0a638-c445-4931-861e-d35704487c97/horizon/0.log" Mar 18 11:11:48 crc kubenswrapper[4778]: I0318 11:11:48.750486 
4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gfphd_7c70009e-cfb3-4598-9ae4-f1d90a2a63d5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:48 crc kubenswrapper[4778]: I0318 11:11:48.769837 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizontest-tests-horizontest_49ff1200-d42e-4022-990d-619169f357f4/horizontest-tests-horizontest/0.log" Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.031484 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-9zw82_5e5ffed6-fceb-4d38-aa29-e9836a8d9f50/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.269651 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29563801-nctwn_8ace9f11-f4d8-4801-afa2-5b723d52d41e/keystone-cron/0.log" Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.457683 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29563861-czdsg_d34b9add-0199-4bf1-81f8-fa4c2a9138e7/keystone-cron/0.log" Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.566247 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1663e1b0-f9b0-4168-9386-abf2c1b56b43/kube-state-metrics/0.log" Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.763145 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-644f48df4-b7jhq_e0a0a638-c445-4931-861e-d35704487c97/horizon-log/0.log" Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.806317 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr_d50b5540-c2ca-4889-bbb0-3b5d04bc602f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.973753 4778 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_manila-api-0_35adb68e-2fb0-437c-bea7-e46f05e4918c/manila-api-log/0.log" Mar 18 11:11:50 crc kubenswrapper[4778]: I0318 11:11:50.226985 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e9af702d-3a1a-490e-82f5-e99c1718ef83/probe/0.log" Mar 18 11:11:50 crc kubenswrapper[4778]: I0318 11:11:50.243617 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e9af702d-3a1a-490e-82f5-e99c1718ef83/manila-scheduler/0.log" Mar 18 11:11:50 crc kubenswrapper[4778]: I0318 11:11:50.272662 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_35adb68e-2fb0-437c-bea7-e46f05e4918c/manila-api/0.log" Mar 18 11:11:50 crc kubenswrapper[4778]: I0318 11:11:50.456714 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_821dda0e-cde2-45a4-b23a-3d13565be515/probe/0.log" Mar 18 11:11:50 crc kubenswrapper[4778]: I0318 11:11:50.482022 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_821dda0e-cde2-45a4-b23a-3d13565be515/manila-share/0.log" Mar 18 11:11:51 crc kubenswrapper[4778]: I0318 11:11:51.206989 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v_52250b90-fbc6-418e-9a5f-4873d5fa5cd0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:51 crc kubenswrapper[4778]: I0318 11:11:51.719163 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d979499f7-4flxt_da263057-3652-4ae8-8435-4f80e4b13804/neutron-httpd/0.log" Mar 18 11:11:52 crc kubenswrapper[4778]: I0318 11:11:52.428479 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-75996d8fd4-jhtd2_4c045639-00d0-4ba6-9d75-c67934521e29/keystone-api/0.log" Mar 18 11:11:52 crc kubenswrapper[4778]: I0318 11:11:52.498988 4778 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_neutron-6d979499f7-4flxt_da263057-3652-4ae8-8435-4f80e4b13804/neutron-api/0.log" Mar 18 11:11:53 crc kubenswrapper[4778]: I0318 11:11:53.418041 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9ba2a389-4009-4dab-bc75-45a574e50bbc/nova-cell1-conductor-conductor/0.log" Mar 18 11:11:53 crc kubenswrapper[4778]: I0318 11:11:53.465851 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3fc908a0-dc90-4df9-869c-5c0820cac423/nova-cell0-conductor-conductor/0.log" Mar 18 11:11:53 crc kubenswrapper[4778]: I0318 11:11:53.940572 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9549b39b-0fc5-4e89-b64a-de83c80735ed/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 11:11:54 crc kubenswrapper[4778]: I0318 11:11:54.149927 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9_b2db5491-57b4-427a-b306-5e525a1e7c27/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:54 crc kubenswrapper[4778]: I0318 11:11:54.468607 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_28f01ca6-f7d2-4de3-9aa9-256803533b80/nova-metadata-log/0.log" Mar 18 11:11:55 crc kubenswrapper[4778]: I0318 11:11:55.441731 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8a702c51-b7a6-4094-9d34-519102e1cf91/nova-api-log/0.log" Mar 18 11:11:55 crc kubenswrapper[4778]: I0318 11:11:55.578832 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9b1623d1-2084-419e-b36a-80930113a280/nova-scheduler-scheduler/0.log" Mar 18 11:11:55 crc kubenswrapper[4778]: I0318 11:11:55.731760 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_28f01ca6-f7d2-4de3-9aa9-256803533b80/nova-metadata-metadata/0.log" Mar 18 
11:11:55 crc kubenswrapper[4778]: I0318 11:11:55.821605 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_49ce9560-3ee2-48d2-b016-a9feefb3a798/mysql-bootstrap/0.log" Mar 18 11:11:55 crc kubenswrapper[4778]: I0318 11:11:55.986658 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_49ce9560-3ee2-48d2-b016-a9feefb3a798/mysql-bootstrap/0.log" Mar 18 11:11:55 crc kubenswrapper[4778]: I0318 11:11:55.993848 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_49ce9560-3ee2-48d2-b016-a9feefb3a798/galera/0.log" Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.167572 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cfadc08e-9e77-4b6f-be89-fc7c726e85b7/mysql-bootstrap/0.log" Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.323032 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cfadc08e-9e77-4b6f-be89-fc7c726e85b7/mysql-bootstrap/0.log" Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.400347 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cfadc08e-9e77-4b6f-be89-fc7c726e85b7/galera/0.log" Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.499325 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fec302c3-e5fc-4019-b4f5-50de6bdde59f/openstackclient/0.log" Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.616052 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-djmq6_f58533cf-4c57-4c3a-b772-e2a488298d7e/ovn-controller/0.log" Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.674576 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8a702c51-b7a6-4094-9d34-519102e1cf91/nova-api-api/0.log" Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.834908 4778 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-2ldk7_2c6e8f7b-9b48-4814-9e73-fc9833c26cc9/openstack-network-exporter/0.log" Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.854333 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovsdb-server-init/0.log" Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.091085 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovsdb-server/0.log" Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.094454 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovsdb-server-init/0.log" Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.154518 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovs-vswitchd/0.log" Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.313299 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7jqhd_1f0f4177-ad12-4848-bbd7-39b004344cb3/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.415489 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac3419bd-88ba-4b83-bd93-ad5638bc7fd0/openstack-network-exporter/0.log" Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.431694 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac3419bd-88ba-4b83-bd93-ad5638bc7fd0/ovn-northd/0.log" Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.528277 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_113a3fc7-40a1-46f9-b93f-01a34fcaf4aa/openstack-network-exporter/0.log" Mar 18 11:11:57 crc 
kubenswrapper[4778]: I0318 11:11:57.617939 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_113a3fc7-40a1-46f9-b93f-01a34fcaf4aa/ovsdbserver-nb/0.log" Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.731924 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_495e34ad-2f4d-46de-95e9-37b34a35f2d2/ovsdbserver-sb/0.log" Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.741757 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_495e34ad-2f4d-46de-95e9-37b34a35f2d2/openstack-network-exporter/0.log" Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.115583 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f9428cb3-4fdf-4b01-9368-28b413ecf82f/setup-container/0.log" Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.270501 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f9428cb3-4fdf-4b01-9368-28b413ecf82f/setup-container/0.log" Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.361668 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f9428cb3-4fdf-4b01-9368-28b413ecf82f/rabbitmq/0.log" Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.496282 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7588d8786-t6x7l_fe0de426-6927-42ea-8b29-8bc01c27fe69/placement-api/0.log" Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.577468 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0191a745-1fe2-4a1c-b007-96525ad39787/setup-container/0.log" Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.618534 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7588d8786-t6x7l_fe0de426-6927-42ea-8b29-8bc01c27fe69/placement-log/0.log" Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 
11:11:58.740389 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0191a745-1fe2-4a1c-b007-96525ad39787/rabbitmq/0.log" Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.758530 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0191a745-1fe2-4a1c-b007-96525ad39787/setup-container/0.log" Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.877604 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7_613d0a31-a371-4c66-8254-85a7cc864fd0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.972912 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx_136dbfab-32f1-40ee-b685-74411fbc06ba/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.066407 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-9bss8_80a8d263-9bba-4db0-928e-f633b4ad5314/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.245815 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-j74ts_53b18647-af19-457c-9543-2156c1ace738/ssh-known-hosts-edpm-deployment/0.log" Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.467706 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s00-full_757e3758-d646-4267-8c4c-b5efb0dcf709/tempest-tests-tempest-tests-runner/0.log" Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.512921 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s01-single-test_c5a7a532-f8c2-4741-9892-65047a4cb225/tempest-tests-tempest-tests-runner/0.log" Mar 18 11:11:59 crc 
kubenswrapper[4778]: I0318 11:11:59.690807 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-horizontest-horizontest-tests-horizontest_3db5e33d-384f-4df3-bfb8-ba279b83f7e4/test-operator-logs-container/0.log" Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.692326 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-ansibletest-ansibletest-ansibletest_1f57757d-6483-4e1a-9a09-e63026f73e70/test-operator-logs-container/0.log" Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.938132 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_fb176b71-d782-4b0d-963f-94acef50cf11/test-operator-logs-container/0.log" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.022132 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tobiko-tobiko-tests-tobiko_4e028d5e-666c-497c-949e-97860410ad74/test-operator-logs-container/0.log" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.159369 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s00-podified-functional_5c0d8cb1-d7bc-4694-ac54-e0a9f8312557/tobiko-tests-tobiko/0.log" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.177652 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563872-w9xc8"] Mar 18 11:12:00 crc kubenswrapper[4778]: E0318 11:12:00.178054 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b1b987-0fc7-4eb1-b6e3-2bc047d48992" containerName="container-00" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.178071 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b1b987-0fc7-4eb1-b6e3-2bc047d48992" containerName="container-00" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.178630 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="07b1b987-0fc7-4eb1-b6e3-2bc047d48992" containerName="container-00" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.179328 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563872-w9xc8" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.181730 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.182004 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.182333 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.203062 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563872-w9xc8"] Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.247047 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s01-sanity_bd565818-8912-47ba-881f-f88011fa9b46/tobiko-tests-tobiko/0.log" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.281027 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvtdz\" (UniqueName: \"kubernetes.io/projected/c2575542-201b-40c8-baec-f64e53f357a6-kube-api-access-pvtdz\") pod \"auto-csr-approver-29563872-w9xc8\" (UID: \"c2575542-201b-40c8-baec-f64e53f357a6\") " pod="openshift-infra/auto-csr-approver-29563872-w9xc8" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.383093 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvtdz\" (UniqueName: \"kubernetes.io/projected/c2575542-201b-40c8-baec-f64e53f357a6-kube-api-access-pvtdz\") pod \"auto-csr-approver-29563872-w9xc8\" (UID: 
\"c2575542-201b-40c8-baec-f64e53f357a6\") " pod="openshift-infra/auto-csr-approver-29563872-w9xc8" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.404730 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvtdz\" (UniqueName: \"kubernetes.io/projected/c2575542-201b-40c8-baec-f64e53f357a6-kube-api-access-pvtdz\") pod \"auto-csr-approver-29563872-w9xc8\" (UID: \"c2575542-201b-40c8-baec-f64e53f357a6\") " pod="openshift-infra/auto-csr-approver-29563872-w9xc8" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.409617 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-44vc9_5e5ecb95-ba90-4f70-ae42-63e71026ffef/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.518265 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563872-w9xc8" Mar 18 11:12:01 crc kubenswrapper[4778]: I0318 11:12:01.045339 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 11:12:01 crc kubenswrapper[4778]: I0318 11:12:01.045720 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563872-w9xc8"] Mar 18 11:12:01 crc kubenswrapper[4778]: I0318 11:12:01.864601 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563872-w9xc8" event={"ID":"c2575542-201b-40c8-baec-f64e53f357a6","Type":"ContainerStarted","Data":"19da28c3a338cd150a9db77cc1bbccafffaa4df006145a305a3db3de00e58567"} Mar 18 11:12:02 crc kubenswrapper[4778]: I0318 11:12:02.877051 4778 generic.go:334] "Generic (PLEG): container finished" podID="c2575542-201b-40c8-baec-f64e53f357a6" containerID="9b9c4586ce364f21cf8a583a2a00575bf65854ca35b9f67e292350a899db8fd9" exitCode=0 Mar 18 11:12:02 crc kubenswrapper[4778]: I0318 11:12:02.877287 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563872-w9xc8" event={"ID":"c2575542-201b-40c8-baec-f64e53f357a6","Type":"ContainerDied","Data":"9b9c4586ce364f21cf8a583a2a00575bf65854ca35b9f67e292350a899db8fd9"} Mar 18 11:12:03 crc kubenswrapper[4778]: I0318 11:12:03.030031 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_fc50d224-cd65-4a46-b3d0-b40acdbda53d/memcached/0.log" Mar 18 11:12:03 crc kubenswrapper[4778]: E0318 11:12:03.188271 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.264958 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563872-w9xc8" Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.366624 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvtdz\" (UniqueName: \"kubernetes.io/projected/c2575542-201b-40c8-baec-f64e53f357a6-kube-api-access-pvtdz\") pod \"c2575542-201b-40c8-baec-f64e53f357a6\" (UID: \"c2575542-201b-40c8-baec-f64e53f357a6\") " Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.373430 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2575542-201b-40c8-baec-f64e53f357a6-kube-api-access-pvtdz" (OuterVolumeSpecName: "kube-api-access-pvtdz") pod "c2575542-201b-40c8-baec-f64e53f357a6" (UID: "c2575542-201b-40c8-baec-f64e53f357a6"). InnerVolumeSpecName "kube-api-access-pvtdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.469832 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvtdz\" (UniqueName: \"kubernetes.io/projected/c2575542-201b-40c8-baec-f64e53f357a6-kube-api-access-pvtdz\") on node \"crc\" DevicePath \"\"" Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.895057 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563872-w9xc8" event={"ID":"c2575542-201b-40c8-baec-f64e53f357a6","Type":"ContainerDied","Data":"19da28c3a338cd150a9db77cc1bbccafffaa4df006145a305a3db3de00e58567"} Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.895087 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563872-w9xc8" Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.895101 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19da28c3a338cd150a9db77cc1bbccafffaa4df006145a305a3db3de00e58567" Mar 18 11:12:05 crc kubenswrapper[4778]: I0318 11:12:05.331737 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563866-8zx72"] Mar 18 11:12:05 crc kubenswrapper[4778]: I0318 11:12:05.341367 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563866-8zx72"] Mar 18 11:12:06 crc kubenswrapper[4778]: I0318 11:12:06.198918 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b734773-4a1f-4acd-80e9-e3cd0cf14c2c" path="/var/lib/kubelet/pods/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c/volumes" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.512363 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cp6wg"] Mar 18 11:12:08 crc kubenswrapper[4778]: E0318 11:12:08.513118 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c2575542-201b-40c8-baec-f64e53f357a6" containerName="oc" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.513131 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2575542-201b-40c8-baec-f64e53f357a6" containerName="oc" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.513355 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2575542-201b-40c8-baec-f64e53f357a6" containerName="oc" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.514712 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.526912 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cp6wg"] Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.553785 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-catalog-content\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.553924 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ntf8\" (UniqueName: \"kubernetes.io/projected/32705561-f15c-4a55-b595-5154e5c0f483-kube-api-access-8ntf8\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.553971 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-utilities\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " 
pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.656328 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-catalog-content\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.656705 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ntf8\" (UniqueName: \"kubernetes.io/projected/32705561-f15c-4a55-b595-5154e5c0f483-kube-api-access-8ntf8\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.656760 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-utilities\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.657114 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-catalog-content\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.657431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-utilities\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " 
pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.687970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ntf8\" (UniqueName: \"kubernetes.io/projected/32705561-f15c-4a55-b595-5154e5c0f483-kube-api-access-8ntf8\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.832688 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:09 crc kubenswrapper[4778]: I0318 11:12:09.218738 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cp6wg"] Mar 18 11:12:09 crc kubenswrapper[4778]: I0318 11:12:09.937290 4778 generic.go:334] "Generic (PLEG): container finished" podID="32705561-f15c-4a55-b595-5154e5c0f483" containerID="10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578" exitCode=0 Mar 18 11:12:09 crc kubenswrapper[4778]: I0318 11:12:09.937394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp6wg" event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerDied","Data":"10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578"} Mar 18 11:12:09 crc kubenswrapper[4778]: I0318 11:12:09.938740 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp6wg" event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerStarted","Data":"ec0a112d6285ed2e19ae645c8a2740504d15449ace3b08259fecdc0e181b8428"} Mar 18 11:12:10 crc kubenswrapper[4778]: I0318 11:12:10.947866 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp6wg" 
event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerStarted","Data":"0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f"} Mar 18 11:12:12 crc kubenswrapper[4778]: I0318 11:12:12.965437 4778 generic.go:334] "Generic (PLEG): container finished" podID="32705561-f15c-4a55-b595-5154e5c0f483" containerID="0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f" exitCode=0 Mar 18 11:12:12 crc kubenswrapper[4778]: I0318 11:12:12.965497 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp6wg" event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerDied","Data":"0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f"} Mar 18 11:12:13 crc kubenswrapper[4778]: I0318 11:12:13.977115 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp6wg" event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerStarted","Data":"0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed"} Mar 18 11:12:14 crc kubenswrapper[4778]: I0318 11:12:14.001378 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cp6wg" podStartSLOduration=2.332215041 podStartE2EDuration="6.001360325s" podCreationTimestamp="2026-03-18 11:12:08 +0000 UTC" firstStartedPulling="2026-03-18 11:12:09.939990808 +0000 UTC m=+7796.514735648" lastFinishedPulling="2026-03-18 11:12:13.609136092 +0000 UTC m=+7800.183880932" observedRunningTime="2026-03-18 11:12:13.993765688 +0000 UTC m=+7800.568510558" watchObservedRunningTime="2026-03-18 11:12:14.001360325 +0000 UTC m=+7800.576105165" Mar 18 11:12:18 crc kubenswrapper[4778]: I0318 11:12:18.833413 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:18 crc kubenswrapper[4778]: I0318 11:12:18.834085 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:18 crc kubenswrapper[4778]: I0318 11:12:18.912327 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:19 crc kubenswrapper[4778]: I0318 11:12:19.068491 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:19 crc kubenswrapper[4778]: I0318 11:12:19.147071 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cp6wg"] Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.037688 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cp6wg" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="registry-server" containerID="cri-o://0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed" gracePeriod=2 Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.570051 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.710447 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ntf8\" (UniqueName: \"kubernetes.io/projected/32705561-f15c-4a55-b595-5154e5c0f483-kube-api-access-8ntf8\") pod \"32705561-f15c-4a55-b595-5154e5c0f483\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.710548 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-utilities\") pod \"32705561-f15c-4a55-b595-5154e5c0f483\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.710624 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-catalog-content\") pod \"32705561-f15c-4a55-b595-5154e5c0f483\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.711587 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-utilities" (OuterVolumeSpecName: "utilities") pod "32705561-f15c-4a55-b595-5154e5c0f483" (UID: "32705561-f15c-4a55-b595-5154e5c0f483"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.717223 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32705561-f15c-4a55-b595-5154e5c0f483-kube-api-access-8ntf8" (OuterVolumeSpecName: "kube-api-access-8ntf8") pod "32705561-f15c-4a55-b595-5154e5c0f483" (UID: "32705561-f15c-4a55-b595-5154e5c0f483"). InnerVolumeSpecName "kube-api-access-8ntf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.812573 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ntf8\" (UniqueName: \"kubernetes.io/projected/32705561-f15c-4a55-b595-5154e5c0f483-kube-api-access-8ntf8\") on node \"crc\" DevicePath \"\"" Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.812612 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.902135 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32705561-f15c-4a55-b595-5154e5c0f483" (UID: "32705561-f15c-4a55-b595-5154e5c0f483"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.913895 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.049026 4778 generic.go:334] "Generic (PLEG): container finished" podID="32705561-f15c-4a55-b595-5154e5c0f483" containerID="0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed" exitCode=0 Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.049071 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp6wg" event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerDied","Data":"0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed"} Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.049102 4778 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-cp6wg" event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerDied","Data":"ec0a112d6285ed2e19ae645c8a2740504d15449ace3b08259fecdc0e181b8428"} Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.049103 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.049124 4778 scope.go:117] "RemoveContainer" containerID="0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.068715 4778 scope.go:117] "RemoveContainer" containerID="0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.098091 4778 scope.go:117] "RemoveContainer" containerID="10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.101509 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cp6wg"] Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.110523 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cp6wg"] Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.147539 4778 scope.go:117] "RemoveContainer" containerID="0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed" Mar 18 11:12:22 crc kubenswrapper[4778]: E0318 11:12:22.148095 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed\": container with ID starting with 0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed not found: ID does not exist" containerID="0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 
11:12:22.148134 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed"} err="failed to get container status \"0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed\": rpc error: code = NotFound desc = could not find container \"0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed\": container with ID starting with 0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed not found: ID does not exist" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.148159 4778 scope.go:117] "RemoveContainer" containerID="0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f" Mar 18 11:12:22 crc kubenswrapper[4778]: E0318 11:12:22.148578 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f\": container with ID starting with 0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f not found: ID does not exist" containerID="0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.148619 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f"} err="failed to get container status \"0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f\": rpc error: code = NotFound desc = could not find container \"0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f\": container with ID starting with 0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f not found: ID does not exist" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.148645 4778 scope.go:117] "RemoveContainer" containerID="10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578" Mar 18 11:12:22 crc 
kubenswrapper[4778]: E0318 11:12:22.155374 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578\": container with ID starting with 10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578 not found: ID does not exist" containerID="10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.155441 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578"} err="failed to get container status \"10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578\": rpc error: code = NotFound desc = could not find container \"10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578\": container with ID starting with 10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578 not found: ID does not exist" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.202126 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32705561-f15c-4a55-b595-5154e5c0f483" path="/var/lib/kubelet/pods/32705561-f15c-4a55-b595-5154e5c0f483/volumes" Mar 18 11:12:23 crc kubenswrapper[4778]: I0318 11:12:23.896913 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-fsxlt_3390909b-6271-40dd-9662-0710f6866143/manager/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.174824 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-7mbx2_710ababb-0bee-441d-8dd0-e6a72ea2b2e3/manager/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.335161 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/util/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.581849 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/pull/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.625981 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/pull/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.694381 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/util/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.846544 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/util/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.889110 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/pull/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.961943 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/extract/0.log" Mar 18 11:12:25 crc kubenswrapper[4778]: I0318 11:12:25.150838 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-wb4pc_b41dbd4a-33dd-4dca-9356-34c740e8063f/manager/0.log" Mar 18 11:12:25 crc 
kubenswrapper[4778]: I0318 11:12:25.282126 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-t5c4w_aceb2f7b-585f-451a-83b8-e673965ada87/manager/0.log" Mar 18 11:12:25 crc kubenswrapper[4778]: I0318 11:12:25.392369 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-x7rnp_124dc549-cb2a-4b1c-a610-093cf9b8c05d/manager/0.log" Mar 18 11:12:25 crc kubenswrapper[4778]: I0318 11:12:25.771669 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-fjjvl_3c86f76c-1617-45e9-9573-f6fd51803b45/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.051876 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-5xvtc_e1ec7bae-8e15-4844-84d2-ff5951d0be31/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.062235 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-64c4x_66d3bf3a-086c-4340-ba73-209f526fc33c/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.291248 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-zpc92_211c991a-9406-4360-aa7f-830be3aa55db/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.390691 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-wxftc_0526f654-9ddc-4495-bb04-be13e53b6a1b/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.430878 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-47sbc_37675366-70a8-4e0b-b92b-f7055547d918/manager/0.log" Mar 18 
11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.541225 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-k4r2p_ae690990-eeb1-4871-8c51-dd3b547e1193/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.712300 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-h6whs_e245908e-e35e-403c-93f6-48371904ae42/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.758104 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-pzjdt_c776af1e-ad54-40fe-9bed-a0a09ce0eea7/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.883725 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-xdgmv_80822932-2943-4f81-9436-1553ed031359/manager/0.log" Mar 18 11:12:27 crc kubenswrapper[4778]: I0318 11:12:27.064411 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-654f4fc7f7-9d4pb_b8267dff-2541-481e-bc64-13eb8d19300b/operator/0.log" Mar 18 11:12:27 crc kubenswrapper[4778]: I0318 11:12:27.227303 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-v7qxm_c508c810-232f-48c1-8d15-bbbb118d2948/registry-server/0.log" Mar 18 11:12:27 crc kubenswrapper[4778]: I0318 11:12:27.343658 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-fgfk9_208b26f2-3c91-4966-9d01-8fe73e4a7d87/manager/0.log" Mar 18 11:12:27 crc kubenswrapper[4778]: I0318 11:12:27.516106 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-d5w9q_2f8e8860-00a1-43fc-9776-c617f270cc50/manager/0.log" 
Mar 18 11:12:27 crc kubenswrapper[4778]: I0318 11:12:27.675311 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5jrv8_b837636e-8c09-42b7-9a81-e7875df68344/operator/0.log" Mar 18 11:12:27 crc kubenswrapper[4778]: I0318 11:12:27.797242 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-c6l5k_8ccabb3b-da59-4ab0-89c8-99094a939f0d/manager/0.log" Mar 18 11:12:28 crc kubenswrapper[4778]: I0318 11:12:28.142944 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-tx9zq_9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77/manager/0.log" Mar 18 11:12:28 crc kubenswrapper[4778]: I0318 11:12:28.213367 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-54c5f5bc8-jsm76_99adb6be-2a3e-4148-8074-9258222ebd60/manager/0.log" Mar 18 11:12:28 crc kubenswrapper[4778]: I0318 11:12:28.345109 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-sgs49_57277339-c9be-4de1-8e35-72ae98d33905/manager/0.log" Mar 18 11:12:28 crc kubenswrapper[4778]: I0318 11:12:28.347037 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-f5c7df4d7-m4kvr_3c7e3158-5139-467d-b33c-808747f0d9be/manager/0.log" Mar 18 11:12:30 crc kubenswrapper[4778]: I0318 11:12:30.148081 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:12:30 crc kubenswrapper[4778]: I0318 11:12:30.148582 4778 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:12:38 crc kubenswrapper[4778]: I0318 11:12:38.714103 4778 scope.go:117] "RemoveContainer" containerID="9ce4ad858c60f25a18c86f0360777510f04c706cb5eafb4da8787fc9df1829e5" Mar 18 11:12:47 crc kubenswrapper[4778]: I0318 11:12:47.076685 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qtggn_ba84f396-0169-4d5e-a126-60ac9d6d49f8/control-plane-machine-set-operator/0.log" Mar 18 11:12:47 crc kubenswrapper[4778]: I0318 11:12:47.336924 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-smtz9_f06790e0-cf8c-48f0-8d48-893663fdbd1c/kube-rbac-proxy/0.log" Mar 18 11:12:47 crc kubenswrapper[4778]: I0318 11:12:47.377069 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-smtz9_f06790e0-cf8c-48f0-8d48-893663fdbd1c/machine-api-operator/0.log" Mar 18 11:13:00 crc kubenswrapper[4778]: I0318 11:13:00.147630 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:13:00 crc kubenswrapper[4778]: I0318 11:13:00.148280 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:13:00 crc 
kubenswrapper[4778]: I0318 11:13:00.321491 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-qrqw4_e39be52c-c244-44cc-a707-0ec9994991fa/cert-manager-controller/0.log" Mar 18 11:13:00 crc kubenswrapper[4778]: I0318 11:13:00.515121 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-khqrg_24a88e8d-e986-4b3d-a77e-1a3e5162ac9c/cert-manager-cainjector/0.log" Mar 18 11:13:00 crc kubenswrapper[4778]: I0318 11:13:00.541193 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-hjskg_f09bc4b7-d305-4674-8540-283bd0b4901c/cert-manager-webhook/0.log" Mar 18 11:13:13 crc kubenswrapper[4778]: I0318 11:13:13.540061 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-22c9p_8b636ef7-4b85-4506-bb2a-f89bee9b028d/nmstate-console-plugin/0.log" Mar 18 11:13:13 crc kubenswrapper[4778]: I0318 11:13:13.688399 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5thsf_5b97fa25-4d3d-4664-a5fc-41c98bbd272f/nmstate-handler/0.log" Mar 18 11:13:13 crc kubenswrapper[4778]: I0318 11:13:13.729304 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wq8gr_71b50b27-6084-4693-acbc-d14f36759618/kube-rbac-proxy/0.log" Mar 18 11:13:13 crc kubenswrapper[4778]: I0318 11:13:13.780742 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wq8gr_71b50b27-6084-4693-acbc-d14f36759618/nmstate-metrics/0.log" Mar 18 11:13:13 crc kubenswrapper[4778]: I0318 11:13:13.959410 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-sr9ls_1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe/nmstate-operator/0.log" Mar 18 11:13:14 crc kubenswrapper[4778]: I0318 11:13:14.010366 4778 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-thw7f_5961b98d-a41a-4ceb-bb71-4bf3a0fc854d/nmstate-webhook/0.log" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.749309 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k6xz8"] Mar 18 11:13:23 crc kubenswrapper[4778]: E0318 11:13:23.750052 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="extract-utilities" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.750064 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="extract-utilities" Mar 18 11:13:23 crc kubenswrapper[4778]: E0318 11:13:23.750082 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="registry-server" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.750088 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="registry-server" Mar 18 11:13:23 crc kubenswrapper[4778]: E0318 11:13:23.750115 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="extract-content" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.750121 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="extract-content" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.750310 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="registry-server" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.751499 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.765695 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6xz8"] Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.937979 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwhs5\" (UniqueName: \"kubernetes.io/projected/7b138f36-1b83-46c2-bcff-84a0f03d3921-kube-api-access-zwhs5\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.938959 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-catalog-content\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.939097 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-utilities\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.040971 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-utilities\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.041151 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zwhs5\" (UniqueName: \"kubernetes.io/projected/7b138f36-1b83-46c2-bcff-84a0f03d3921-kube-api-access-zwhs5\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.041203 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-catalog-content\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.041473 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-utilities\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.041538 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-catalog-content\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.062786 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwhs5\" (UniqueName: \"kubernetes.io/projected/7b138f36-1b83-46c2-bcff-84a0f03d3921-kube-api-access-zwhs5\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.072729 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.611465 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6xz8"] Mar 18 11:13:24 crc kubenswrapper[4778]: W0318 11:13:24.612514 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b138f36_1b83_46c2_bcff_84a0f03d3921.slice/crio-780daef24cf10bd44e52b7e1feeb9ea337d24c2ed2241a25071fdf982ddb4787 WatchSource:0}: Error finding container 780daef24cf10bd44e52b7e1feeb9ea337d24c2ed2241a25071fdf982ddb4787: Status 404 returned error can't find the container with id 780daef24cf10bd44e52b7e1feeb9ea337d24c2ed2241a25071fdf982ddb4787 Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.676069 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6xz8" event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerStarted","Data":"780daef24cf10bd44e52b7e1feeb9ea337d24c2ed2241a25071fdf982ddb4787"} Mar 18 11:13:25 crc kubenswrapper[4778]: I0318 11:13:25.684932 4778 generic.go:334] "Generic (PLEG): container finished" podID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerID="91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76" exitCode=0 Mar 18 11:13:25 crc kubenswrapper[4778]: I0318 11:13:25.684984 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6xz8" event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerDied","Data":"91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76"} Mar 18 11:13:27 crc kubenswrapper[4778]: I0318 11:13:27.702488 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6xz8" 
event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerStarted","Data":"270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33"} Mar 18 11:13:28 crc kubenswrapper[4778]: E0318 11:13:28.187122 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:13:28 crc kubenswrapper[4778]: I0318 11:13:28.713592 4778 generic.go:334] "Generic (PLEG): container finished" podID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerID="270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33" exitCode=0 Mar 18 11:13:28 crc kubenswrapper[4778]: I0318 11:13:28.713660 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6xz8" event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerDied","Data":"270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33"} Mar 18 11:13:29 crc kubenswrapper[4778]: I0318 11:13:29.725955 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6xz8" event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerStarted","Data":"6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390"} Mar 18 11:13:29 crc kubenswrapper[4778]: I0318 11:13:29.754113 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k6xz8" podStartSLOduration=3.277500829 podStartE2EDuration="6.754094914s" podCreationTimestamp="2026-03-18 11:13:23 +0000 UTC" firstStartedPulling="2026-03-18 11:13:25.68789368 +0000 UTC m=+7872.262638520" lastFinishedPulling="2026-03-18 11:13:29.164487765 +0000 UTC m=+7875.739232605" observedRunningTime="2026-03-18 11:13:29.74659213 +0000 UTC m=+7876.321336970" watchObservedRunningTime="2026-03-18 11:13:29.754094914 +0000 UTC m=+7876.328839744" 
Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.147263 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.147540 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.147584 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.148382 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.148438 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" gracePeriod=600 Mar 18 11:13:30 crc kubenswrapper[4778]: E0318 11:13:30.303937 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.736408 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" exitCode=0 Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.736505 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"} Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.736588 4778 scope.go:117] "RemoveContainer" containerID="cbe46d0fa65f0ead7ad1789d13ec33e6d93d408d865f08e3bbc666ffb661db35" Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.737680 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:13:30 crc kubenswrapper[4778]: E0318 11:13:30.738053 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:13:34 crc kubenswrapper[4778]: I0318 11:13:34.073076 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:34 crc kubenswrapper[4778]: I0318 11:13:34.073534 4778 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:34 crc kubenswrapper[4778]: I0318 11:13:34.160394 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:34 crc kubenswrapper[4778]: I0318 11:13:34.867404 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:34 crc kubenswrapper[4778]: I0318 11:13:34.933718 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6xz8"] Mar 18 11:13:36 crc kubenswrapper[4778]: I0318 11:13:36.838070 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k6xz8" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="registry-server" containerID="cri-o://6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390" gracePeriod=2 Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.278942 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.409398 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-utilities\") pod \"7b138f36-1b83-46c2-bcff-84a0f03d3921\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.409514 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-catalog-content\") pod \"7b138f36-1b83-46c2-bcff-84a0f03d3921\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.409708 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwhs5\" (UniqueName: \"kubernetes.io/projected/7b138f36-1b83-46c2-bcff-84a0f03d3921-kube-api-access-zwhs5\") pod \"7b138f36-1b83-46c2-bcff-84a0f03d3921\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.422147 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-utilities" (OuterVolumeSpecName: "utilities") pod "7b138f36-1b83-46c2-bcff-84a0f03d3921" (UID: "7b138f36-1b83-46c2-bcff-84a0f03d3921"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.422260 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b138f36-1b83-46c2-bcff-84a0f03d3921-kube-api-access-zwhs5" (OuterVolumeSpecName: "kube-api-access-zwhs5") pod "7b138f36-1b83-46c2-bcff-84a0f03d3921" (UID: "7b138f36-1b83-46c2-bcff-84a0f03d3921"). InnerVolumeSpecName "kube-api-access-zwhs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.465914 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b138f36-1b83-46c2-bcff-84a0f03d3921" (UID: "7b138f36-1b83-46c2-bcff-84a0f03d3921"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.512407 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwhs5\" (UniqueName: \"kubernetes.io/projected/7b138f36-1b83-46c2-bcff-84a0f03d3921-kube-api-access-zwhs5\") on node \"crc\" DevicePath \"\"" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.512444 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.512457 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.853003 4778 generic.go:334] "Generic (PLEG): container finished" podID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerID="6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390" exitCode=0 Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.853050 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6xz8" event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerDied","Data":"6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390"} Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.853083 4778 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-k6xz8" event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerDied","Data":"780daef24cf10bd44e52b7e1feeb9ea337d24c2ed2241a25071fdf982ddb4787"} Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.853102 4778 scope.go:117] "RemoveContainer" containerID="6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.853287 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.879518 4778 scope.go:117] "RemoveContainer" containerID="270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.890326 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6xz8"] Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.901510 4778 scope.go:117] "RemoveContainer" containerID="91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.902295 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k6xz8"] Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.942745 4778 scope.go:117] "RemoveContainer" containerID="6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390" Mar 18 11:13:37 crc kubenswrapper[4778]: E0318 11:13:37.943124 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390\": container with ID starting with 6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390 not found: ID does not exist" containerID="6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 
11:13:37.943164 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390"} err="failed to get container status \"6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390\": rpc error: code = NotFound desc = could not find container \"6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390\": container with ID starting with 6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390 not found: ID does not exist" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.943190 4778 scope.go:117] "RemoveContainer" containerID="270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33" Mar 18 11:13:37 crc kubenswrapper[4778]: E0318 11:13:37.943554 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33\": container with ID starting with 270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33 not found: ID does not exist" containerID="270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.943583 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33"} err="failed to get container status \"270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33\": rpc error: code = NotFound desc = could not find container \"270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33\": container with ID starting with 270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33 not found: ID does not exist" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.943605 4778 scope.go:117] "RemoveContainer" containerID="91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76" Mar 18 11:13:37 crc 
kubenswrapper[4778]: E0318 11:13:37.943896 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76\": container with ID starting with 91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76 not found: ID does not exist" containerID="91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.943936 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76"} err="failed to get container status \"91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76\": rpc error: code = NotFound desc = could not find container \"91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76\": container with ID starting with 91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76 not found: ID does not exist" Mar 18 11:13:38 crc kubenswrapper[4778]: I0318 11:13:38.196920 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" path="/var/lib/kubelet/pods/7b138f36-1b83-46c2-bcff-84a0f03d3921/volumes" Mar 18 11:13:42 crc kubenswrapper[4778]: I0318 11:13:42.642414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-sv9kd_1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c/kube-rbac-proxy/0.log" Mar 18 11:13:42 crc kubenswrapper[4778]: I0318 11:13:42.730635 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-sv9kd_1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c/controller/0.log" Mar 18 11:13:42 crc kubenswrapper[4778]: I0318 11:13:42.869287 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: 
I0318 11:13:43.025767 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.083345 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.084954 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.113919 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.295758 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.297994 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.331357 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.362347 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.508118 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.555142 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.620469 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.637029 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/controller/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.742069 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/frr-metrics/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.842709 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/kube-rbac-proxy/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.858459 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/kube-rbac-proxy-frr/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.948281 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/reloader/0.log" Mar 18 11:13:44 crc kubenswrapper[4778]: I0318 11:13:44.116712 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jrtjv_0f18e9f0-b3eb-440a-b035-ed8256df5ed9/frr-k8s-webhook-server/0.log" Mar 18 11:13:44 crc kubenswrapper[4778]: I0318 11:13:44.272005 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-78856dcdc4-9cltx_721ee07f-fded-43ab-9bb7-2e4e56c98515/manager/0.log" Mar 18 11:13:44 crc kubenswrapper[4778]: I0318 11:13:44.361287 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b499db45c-c5tcr_75885bb8-adce-4801-8941-75042ab330ea/webhook-server/0.log" Mar 18 11:13:44 crc kubenswrapper[4778]: I0318 11:13:44.780344 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wd69x_1c97662e-d673-42c1-a6ad-75865ba2b8b6/kube-rbac-proxy/0.log" Mar 18 11:13:45 crc kubenswrapper[4778]: I0318 11:13:45.187755 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:13:45 crc kubenswrapper[4778]: E0318 11:13:45.188075 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:13:45 crc kubenswrapper[4778]: I0318 11:13:45.360944 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wd69x_1c97662e-d673-42c1-a6ad-75865ba2b8b6/speaker/0.log" Mar 18 11:13:46 crc kubenswrapper[4778]: I0318 11:13:46.025744 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/frr/0.log" Mar 18 11:13:58 crc kubenswrapper[4778]: I0318 11:13:58.188446 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:13:58 crc kubenswrapper[4778]: E0318 11:13:58.189167 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.110019 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/util/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.339957 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/util/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.341638 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/pull/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.444920 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/pull/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.592783 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/util/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.604054 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/extract/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.616538 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/pull/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.776087 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/util/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.009785 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/util/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.047098 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/pull/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.072137 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/pull/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.156871 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563874-hj4l4"] Mar 18 11:14:00 crc kubenswrapper[4778]: E0318 11:14:00.157332 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="registry-server" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.157348 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="registry-server" Mar 18 11:14:00 crc kubenswrapper[4778]: E0318 11:14:00.157374 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="extract-content" Mar 18 11:14:00 crc 
kubenswrapper[4778]: I0318 11:14:00.157381 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="extract-content" Mar 18 11:14:00 crc kubenswrapper[4778]: E0318 11:14:00.157402 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="extract-utilities" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.157411 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="extract-utilities" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.157685 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="registry-server" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.158519 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.160213 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.161083 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.161888 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.173893 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563874-hj4l4"] Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.250405 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/extract/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 
11:14:00.273854 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95jdw\" (UniqueName: \"kubernetes.io/projected/894be30f-4dc9-4a4c-b443-2393b89df180-kube-api-access-95jdw\") pod \"auto-csr-approver-29563874-hj4l4\" (UID: \"894be30f-4dc9-4a4c-b443-2393b89df180\") " pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.295666 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/util/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.317125 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/pull/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.376131 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95jdw\" (UniqueName: \"kubernetes.io/projected/894be30f-4dc9-4a4c-b443-2393b89df180-kube-api-access-95jdw\") pod \"auto-csr-approver-29563874-hj4l4\" (UID: \"894be30f-4dc9-4a4c-b443-2393b89df180\") " pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.399016 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95jdw\" (UniqueName: \"kubernetes.io/projected/894be30f-4dc9-4a4c-b443-2393b89df180-kube-api-access-95jdw\") pod \"auto-csr-approver-29563874-hj4l4\" (UID: \"894be30f-4dc9-4a4c-b443-2393b89df180\") " pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.437047 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-utilities/0.log" Mar 18 11:14:00 crc 
kubenswrapper[4778]: I0318 11:14:00.477085 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.682371 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-utilities/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.724395 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-content/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.736272 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-content/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.011819 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563874-hj4l4"] Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.044445 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" event={"ID":"894be30f-4dc9-4a4c-b443-2393b89df180","Type":"ContainerStarted","Data":"552bca3d7cacd76ec3cc9a9d5dae7c12a8765fdd6cc731d4701158a4a76c124e"} Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.150800 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-content/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.197544 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-utilities/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.384860 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-utilities/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.569872 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-utilities/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.584409 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-content/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.682332 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-content/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.874879 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-utilities/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.933958 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-content/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.081759 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/registry-server/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.130415 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jj774_e037e8cd-1543-49a8-9389-4cc6f440c4b3/marketplace-operator/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.337937 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-utilities/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.527514 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-utilities/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.641071 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-content/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.719423 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-content/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.754850 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/registry-server/0.log" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.081016 4778 generic.go:334] "Generic (PLEG): container finished" podID="894be30f-4dc9-4a4c-b443-2393b89df180" containerID="8d49e3ed1fe8885369ea3edca8ac073df8e6a13bd130a54cf891e878c3833fb8" exitCode=0 Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.081057 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" event={"ID":"894be30f-4dc9-4a4c-b443-2393b89df180","Type":"ContainerDied","Data":"8d49e3ed1fe8885369ea3edca8ac073df8e6a13bd130a54cf891e878c3833fb8"} Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.179495 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5fvxm"] Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.181469 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.190671 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fvxm"] Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.209385 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-utilities/0.log" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.209719 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-content/0.log" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.234830 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-utilities\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.234912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-catalog-content\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.234967 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f8tm\" (UniqueName: \"kubernetes.io/projected/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-kube-api-access-9f8tm\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 
11:14:03.336623 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-utilities\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.336692 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-catalog-content\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.336780 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f8tm\" (UniqueName: \"kubernetes.io/projected/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-kube-api-access-9f8tm\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.338708 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-utilities\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.341307 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-catalog-content\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.355421 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9f8tm\" (UniqueName: \"kubernetes.io/projected/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-kube-api-access-9f8tm\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.470282 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-utilities/0.log" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.518008 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.542298 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/registry-server/0.log" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.831125 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fvxm"] Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.944609 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-content/0.log" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.975390 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-utilities/0.log" Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.078656 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-content/0.log" Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.138866 4778 generic.go:334] "Generic (PLEG): container finished" podID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" 
containerID="18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e" exitCode=0 Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.139277 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerDied","Data":"18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e"} Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.139332 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerStarted","Data":"eceaba1cadc94da3645fb8bab4d0ecd5290d1d81ac72c2392e76e15fcc8275b3"} Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.169934 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-utilities/0.log" Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.230867 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-content/0.log" Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.636863 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.692827 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95jdw\" (UniqueName: \"kubernetes.io/projected/894be30f-4dc9-4a4c-b443-2393b89df180-kube-api-access-95jdw\") pod \"894be30f-4dc9-4a4c-b443-2393b89df180\" (UID: \"894be30f-4dc9-4a4c-b443-2393b89df180\") " Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.698966 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/894be30f-4dc9-4a4c-b443-2393b89df180-kube-api-access-95jdw" (OuterVolumeSpecName: "kube-api-access-95jdw") pod "894be30f-4dc9-4a4c-b443-2393b89df180" (UID: "894be30f-4dc9-4a4c-b443-2393b89df180"). InnerVolumeSpecName "kube-api-access-95jdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.799124 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95jdw\" (UniqueName: \"kubernetes.io/projected/894be30f-4dc9-4a4c-b443-2393b89df180-kube-api-access-95jdw\") on node \"crc\" DevicePath \"\"" Mar 18 11:14:05 crc kubenswrapper[4778]: I0318 11:14:05.152934 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" event={"ID":"894be30f-4dc9-4a4c-b443-2393b89df180","Type":"ContainerDied","Data":"552bca3d7cacd76ec3cc9a9d5dae7c12a8765fdd6cc731d4701158a4a76c124e"} Mar 18 11:14:05 crc kubenswrapper[4778]: I0318 11:14:05.153258 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="552bca3d7cacd76ec3cc9a9d5dae7c12a8765fdd6cc731d4701158a4a76c124e" Mar 18 11:14:05 crc kubenswrapper[4778]: I0318 11:14:05.153308 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:05 crc kubenswrapper[4778]: I0318 11:14:05.176837 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/registry-server/0.log" Mar 18 11:14:05 crc kubenswrapper[4778]: I0318 11:14:05.703909 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563868-txd2q"] Mar 18 11:14:05 crc kubenswrapper[4778]: I0318 11:14:05.718042 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563868-txd2q"] Mar 18 11:14:06 crc kubenswrapper[4778]: I0318 11:14:06.163907 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerStarted","Data":"804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182"} Mar 18 11:14:06 crc kubenswrapper[4778]: I0318 11:14:06.197308 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c91868-ef15-4d6d-8547-1b2849d7aa95" path="/var/lib/kubelet/pods/48c91868-ef15-4d6d-8547-1b2849d7aa95/volumes" Mar 18 11:14:09 crc kubenswrapper[4778]: I0318 11:14:09.187454 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:14:09 crc kubenswrapper[4778]: E0318 11:14:09.188325 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:14:11 crc kubenswrapper[4778]: I0318 11:14:11.212334 4778 generic.go:334] "Generic 
(PLEG): container finished" podID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerID="804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182" exitCode=0 Mar 18 11:14:11 crc kubenswrapper[4778]: I0318 11:14:11.212425 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerDied","Data":"804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182"} Mar 18 11:14:12 crc kubenswrapper[4778]: I0318 11:14:12.230181 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerStarted","Data":"2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84"} Mar 18 11:14:12 crc kubenswrapper[4778]: I0318 11:14:12.288536 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5fvxm" podStartSLOduration=1.753011698 podStartE2EDuration="9.288512747s" podCreationTimestamp="2026-03-18 11:14:03 +0000 UTC" firstStartedPulling="2026-03-18 11:14:04.142568931 +0000 UTC m=+7910.717313771" lastFinishedPulling="2026-03-18 11:14:11.67806998 +0000 UTC m=+7918.252814820" observedRunningTime="2026-03-18 11:14:12.27575107 +0000 UTC m=+7918.850495930" watchObservedRunningTime="2026-03-18 11:14:12.288512747 +0000 UTC m=+7918.863257597" Mar 18 11:14:13 crc kubenswrapper[4778]: I0318 11:14:13.518581 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:13 crc kubenswrapper[4778]: I0318 11:14:13.518906 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:14 crc kubenswrapper[4778]: I0318 11:14:14.567898 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5fvxm" 
podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="registry-server" probeResult="failure" output=< Mar 18 11:14:14 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 11:14:14 crc kubenswrapper[4778]: > Mar 18 11:14:22 crc kubenswrapper[4778]: E0318 11:14:22.667654 4778 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.70:42540->38.102.83.70:35463: write tcp 38.102.83.70:42540->38.102.83.70:35463: write: broken pipe Mar 18 11:14:23 crc kubenswrapper[4778]: I0318 11:14:23.187901 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:14:23 crc kubenswrapper[4778]: E0318 11:14:23.188533 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:14:23 crc kubenswrapper[4778]: I0318 11:14:23.568528 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:23 crc kubenswrapper[4778]: I0318 11:14:23.626502 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:23 crc kubenswrapper[4778]: I0318 11:14:23.812893 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fvxm"] Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.333778 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5fvxm" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="registry-server" 
containerID="cri-o://2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84" gracePeriod=2 Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.885830 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.965781 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-catalog-content\") pod \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.965905 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f8tm\" (UniqueName: \"kubernetes.io/projected/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-kube-api-access-9f8tm\") pod \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.965978 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-utilities\") pod \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.967456 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-utilities" (OuterVolumeSpecName: "utilities") pod "3c19e2f3-650b-4de7-8f71-f4ea6631c79c" (UID: "3c19e2f3-650b-4de7-8f71-f4ea6631c79c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.972501 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-kube-api-access-9f8tm" (OuterVolumeSpecName: "kube-api-access-9f8tm") pod "3c19e2f3-650b-4de7-8f71-f4ea6631c79c" (UID: "3c19e2f3-650b-4de7-8f71-f4ea6631c79c"). InnerVolumeSpecName "kube-api-access-9f8tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.072826 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f8tm\" (UniqueName: \"kubernetes.io/projected/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-kube-api-access-9f8tm\") on node \"crc\" DevicePath \"\"" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.072859 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.115286 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c19e2f3-650b-4de7-8f71-f4ea6631c79c" (UID: "3c19e2f3-650b-4de7-8f71-f4ea6631c79c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.174690 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.345932 4778 generic.go:334] "Generic (PLEG): container finished" podID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerID="2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84" exitCode=0 Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.345976 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerDied","Data":"2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84"} Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.346001 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerDied","Data":"eceaba1cadc94da3645fb8bab4d0ecd5290d1d81ac72c2392e76e15fcc8275b3"} Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.346020 4778 scope.go:117] "RemoveContainer" containerID="2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.346147 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.374560 4778 scope.go:117] "RemoveContainer" containerID="804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.378489 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fvxm"] Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.390006 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5fvxm"] Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.434921 4778 scope.go:117] "RemoveContainer" containerID="18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.493676 4778 scope.go:117] "RemoveContainer" containerID="2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84" Mar 18 11:14:26 crc kubenswrapper[4778]: E0318 11:14:26.494676 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84\": container with ID starting with 2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84 not found: ID does not exist" containerID="2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.494711 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84"} err="failed to get container status \"2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84\": rpc error: code = NotFound desc = could not find container \"2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84\": container with ID starting with 2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84 not found: ID does 
not exist" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.494727 4778 scope.go:117] "RemoveContainer" containerID="804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182" Mar 18 11:14:26 crc kubenswrapper[4778]: E0318 11:14:26.496255 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182\": container with ID starting with 804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182 not found: ID does not exist" containerID="804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.496282 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182"} err="failed to get container status \"804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182\": rpc error: code = NotFound desc = could not find container \"804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182\": container with ID starting with 804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182 not found: ID does not exist" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.496296 4778 scope.go:117] "RemoveContainer" containerID="18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e" Mar 18 11:14:26 crc kubenswrapper[4778]: E0318 11:14:26.500321 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e\": container with ID starting with 18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e not found: ID does not exist" containerID="18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.500366 4778 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e"} err="failed to get container status \"18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e\": rpc error: code = NotFound desc = could not find container \"18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e\": container with ID starting with 18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e not found: ID does not exist" Mar 18 11:14:28 crc kubenswrapper[4778]: I0318 11:14:28.200489 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" path="/var/lib/kubelet/pods/3c19e2f3-650b-4de7-8f71-f4ea6631c79c/volumes" Mar 18 11:14:37 crc kubenswrapper[4778]: I0318 11:14:37.187240 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:14:37 crc kubenswrapper[4778]: E0318 11:14:37.188415 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:14:38 crc kubenswrapper[4778]: I0318 11:14:38.837985 4778 scope.go:117] "RemoveContainer" containerID="ecc2f8a6686d5391d07b662f53f7a3bdd9927adf67509a229601871555c0b456" Mar 18 11:14:52 crc kubenswrapper[4778]: I0318 11:14:52.189115 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:14:52 crc kubenswrapper[4778]: E0318 11:14:52.189975 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:14:53 crc kubenswrapper[4778]: E0318 11:14:53.187432 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.171765 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"] Mar 18 11:15:00 crc kubenswrapper[4778]: E0318 11:15:00.173216 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="registry-server" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.173242 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="registry-server" Mar 18 11:15:00 crc kubenswrapper[4778]: E0318 11:15:00.173300 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="extract-utilities" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.173313 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="extract-utilities" Mar 18 11:15:00 crc kubenswrapper[4778]: E0318 11:15:00.173384 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894be30f-4dc9-4a4c-b443-2393b89df180" containerName="oc" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.173397 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="894be30f-4dc9-4a4c-b443-2393b89df180" containerName="oc" Mar 18 11:15:00 crc kubenswrapper[4778]: E0318 
11:15:00.173441 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="extract-content" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.173452 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="extract-content" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.173992 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="registry-server" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.174054 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="894be30f-4dc9-4a4c-b443-2393b89df180" containerName="oc" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.175187 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.178167 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.179022 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.202693 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"] Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.276279 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-config-volume\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 
11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.276336 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5prz\" (UniqueName: \"kubernetes.io/projected/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-kube-api-access-l5prz\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.276930 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-secret-volume\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.379805 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-secret-volume\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.379970 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-config-volume\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.380007 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5prz\" (UniqueName: \"kubernetes.io/projected/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-kube-api-access-l5prz\") pod 
\"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.383170 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-config-volume\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.404065 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-secret-volume\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.409678 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5prz\" (UniqueName: \"kubernetes.io/projected/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-kube-api-access-l5prz\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.500372 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.974377 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"] Mar 18 11:15:00 crc kubenswrapper[4778]: W0318 11:15:00.980812 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ef320b3_caf2_4ff6_aa7e_1a5e059effff.slice/crio-01f33ed17e5733897772695cc06cbddfe938fe839de0ab8a747f8a2f760c3c80 WatchSource:0}: Error finding container 01f33ed17e5733897772695cc06cbddfe938fe839de0ab8a747f8a2f760c3c80: Status 404 returned error can't find the container with id 01f33ed17e5733897772695cc06cbddfe938fe839de0ab8a747f8a2f760c3c80 Mar 18 11:15:01 crc kubenswrapper[4778]: I0318 11:15:01.719657 4778 generic.go:334] "Generic (PLEG): container finished" podID="5ef320b3-caf2-4ff6-aa7e-1a5e059effff" containerID="83ccfb7a3c0a73cfc5ac20a3b0e1058355f9ff7ce52c96b3182b319f43df6e02" exitCode=0 Mar 18 11:15:01 crc kubenswrapper[4778]: I0318 11:15:01.719917 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" event={"ID":"5ef320b3-caf2-4ff6-aa7e-1a5e059effff","Type":"ContainerDied","Data":"83ccfb7a3c0a73cfc5ac20a3b0e1058355f9ff7ce52c96b3182b319f43df6e02"} Mar 18 11:15:01 crc kubenswrapper[4778]: I0318 11:15:01.720110 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" event={"ID":"5ef320b3-caf2-4ff6-aa7e-1a5e059effff","Type":"ContainerStarted","Data":"01f33ed17e5733897772695cc06cbddfe938fe839de0ab8a747f8a2f760c3c80"} Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.145998 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.162949 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-config-volume\") pod \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.163052 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5prz\" (UniqueName: \"kubernetes.io/projected/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-kube-api-access-l5prz\") pod \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.163408 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-secret-volume\") pod \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.167327 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-config-volume" (OuterVolumeSpecName: "config-volume") pod "5ef320b3-caf2-4ff6-aa7e-1a5e059effff" (UID: "5ef320b3-caf2-4ff6-aa7e-1a5e059effff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.171802 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-kube-api-access-l5prz" (OuterVolumeSpecName: "kube-api-access-l5prz") pod "5ef320b3-caf2-4ff6-aa7e-1a5e059effff" (UID: "5ef320b3-caf2-4ff6-aa7e-1a5e059effff"). 
InnerVolumeSpecName "kube-api-access-l5prz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.182524 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5ef320b3-caf2-4ff6-aa7e-1a5e059effff" (UID: "5ef320b3-caf2-4ff6-aa7e-1a5e059effff"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.266834 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.266882 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5prz\" (UniqueName: \"kubernetes.io/projected/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-kube-api-access-l5prz\") on node \"crc\" DevicePath \"\"" Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.266897 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.739212 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" event={"ID":"5ef320b3-caf2-4ff6-aa7e-1a5e059effff","Type":"ContainerDied","Data":"01f33ed17e5733897772695cc06cbddfe938fe839de0ab8a747f8a2f760c3c80"} Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.739252 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01f33ed17e5733897772695cc06cbddfe938fe839de0ab8a747f8a2f760c3c80" Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.739311 4778 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" Mar 18 11:15:04 crc kubenswrapper[4778]: I0318 11:15:04.232280 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m"] Mar 18 11:15:04 crc kubenswrapper[4778]: I0318 11:15:04.243166 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m"] Mar 18 11:15:06 crc kubenswrapper[4778]: I0318 11:15:06.201980 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="688101ed-133b-42c6-87f0-fb2ce2afa33f" path="/var/lib/kubelet/pods/688101ed-133b-42c6-87f0-fb2ce2afa33f/volumes" Mar 18 11:15:07 crc kubenswrapper[4778]: I0318 11:15:07.187064 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:15:07 crc kubenswrapper[4778]: E0318 11:15:07.187584 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:15:22 crc kubenswrapper[4778]: I0318 11:15:22.187534 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:15:22 crc kubenswrapper[4778]: E0318 11:15:22.188436 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:15:37 crc kubenswrapper[4778]: I0318 11:15:37.188784 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:15:37 crc kubenswrapper[4778]: E0318 11:15:37.189714 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:15:38 crc kubenswrapper[4778]: I0318 11:15:38.923320 4778 scope.go:117] "RemoveContainer" containerID="91439ddaf1c7b64a7912887de697803bd3f4ff4a97a1ee187c7b7ad2914b7556" Mar 18 11:15:49 crc kubenswrapper[4778]: I0318 11:15:49.188293 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:15:49 crc kubenswrapper[4778]: E0318 11:15:49.189258 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.181010 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563876-sl728"] Mar 18 11:16:00 crc kubenswrapper[4778]: E0318 11:16:00.182057 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef320b3-caf2-4ff6-aa7e-1a5e059effff" 
containerName="collect-profiles" Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.182074 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef320b3-caf2-4ff6-aa7e-1a5e059effff" containerName="collect-profiles" Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.182338 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef320b3-caf2-4ff6-aa7e-1a5e059effff" containerName="collect-profiles" Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.183167 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563876-sl728" Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.189093 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.189322 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.189551 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.208055 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563876-sl728"] Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.332687 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2h2d\" (UniqueName: \"kubernetes.io/projected/f1897e5a-c532-4379-9a46-ad5355a45122-kube-api-access-z2h2d\") pod \"auto-csr-approver-29563876-sl728\" (UID: \"f1897e5a-c532-4379-9a46-ad5355a45122\") " pod="openshift-infra/auto-csr-approver-29563876-sl728" Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.434471 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2h2d\" (UniqueName: 
\"kubernetes.io/projected/f1897e5a-c532-4379-9a46-ad5355a45122-kube-api-access-z2h2d\") pod \"auto-csr-approver-29563876-sl728\" (UID: \"f1897e5a-c532-4379-9a46-ad5355a45122\") " pod="openshift-infra/auto-csr-approver-29563876-sl728" Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.458639 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2h2d\" (UniqueName: \"kubernetes.io/projected/f1897e5a-c532-4379-9a46-ad5355a45122-kube-api-access-z2h2d\") pod \"auto-csr-approver-29563876-sl728\" (UID: \"f1897e5a-c532-4379-9a46-ad5355a45122\") " pod="openshift-infra/auto-csr-approver-29563876-sl728" Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.508142 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563876-sl728" Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.980882 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563876-sl728"] Mar 18 11:16:01 crc kubenswrapper[4778]: I0318 11:16:01.358289 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563876-sl728" event={"ID":"f1897e5a-c532-4379-9a46-ad5355a45122","Type":"ContainerStarted","Data":"9d38eeb743b07219d50246a19bcd998b3678d8cc4be7099190b24f6d4a03bec3"} Mar 18 11:16:02 crc kubenswrapper[4778]: E0318 11:16:02.189117 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:16:03 crc kubenswrapper[4778]: I0318 11:16:03.381349 4778 generic.go:334] "Generic (PLEG): container finished" podID="f1897e5a-c532-4379-9a46-ad5355a45122" containerID="149d8c7a5b7b3e6ba4cd600a7a38eec0eda785a24c394741cb2942187806b242" exitCode=0 Mar 18 11:16:03 crc kubenswrapper[4778]: I0318 11:16:03.381674 4778 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563876-sl728" event={"ID":"f1897e5a-c532-4379-9a46-ad5355a45122","Type":"ContainerDied","Data":"149d8c7a5b7b3e6ba4cd600a7a38eec0eda785a24c394741cb2942187806b242"} Mar 18 11:16:04 crc kubenswrapper[4778]: I0318 11:16:04.198080 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:16:04 crc kubenswrapper[4778]: E0318 11:16:04.198647 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:16:04 crc kubenswrapper[4778]: I0318 11:16:04.739472 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563876-sl728" Mar 18 11:16:04 crc kubenswrapper[4778]: I0318 11:16:04.830280 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2h2d\" (UniqueName: \"kubernetes.io/projected/f1897e5a-c532-4379-9a46-ad5355a45122-kube-api-access-z2h2d\") pod \"f1897e5a-c532-4379-9a46-ad5355a45122\" (UID: \"f1897e5a-c532-4379-9a46-ad5355a45122\") " Mar 18 11:16:04 crc kubenswrapper[4778]: I0318 11:16:04.835155 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1897e5a-c532-4379-9a46-ad5355a45122-kube-api-access-z2h2d" (OuterVolumeSpecName: "kube-api-access-z2h2d") pod "f1897e5a-c532-4379-9a46-ad5355a45122" (UID: "f1897e5a-c532-4379-9a46-ad5355a45122"). InnerVolumeSpecName "kube-api-access-z2h2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:16:04 crc kubenswrapper[4778]: I0318 11:16:04.935696 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2h2d\" (UniqueName: \"kubernetes.io/projected/f1897e5a-c532-4379-9a46-ad5355a45122-kube-api-access-z2h2d\") on node \"crc\" DevicePath \"\"" Mar 18 11:16:05 crc kubenswrapper[4778]: I0318 11:16:05.411351 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563876-sl728" event={"ID":"f1897e5a-c532-4379-9a46-ad5355a45122","Type":"ContainerDied","Data":"9d38eeb743b07219d50246a19bcd998b3678d8cc4be7099190b24f6d4a03bec3"} Mar 18 11:16:05 crc kubenswrapper[4778]: I0318 11:16:05.411688 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d38eeb743b07219d50246a19bcd998b3678d8cc4be7099190b24f6d4a03bec3" Mar 18 11:16:05 crc kubenswrapper[4778]: I0318 11:16:05.411529 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563876-sl728" Mar 18 11:16:05 crc kubenswrapper[4778]: I0318 11:16:05.842513 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563870-s7lhp"] Mar 18 11:16:05 crc kubenswrapper[4778]: I0318 11:16:05.855633 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563870-s7lhp"] Mar 18 11:16:06 crc kubenswrapper[4778]: I0318 11:16:06.199321 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08809b1c-c749-4734-9fc4-6a0a755aa9cd" path="/var/lib/kubelet/pods/08809b1c-c749-4734-9fc4-6a0a755aa9cd/volumes" Mar 18 11:16:19 crc kubenswrapper[4778]: I0318 11:16:19.188106 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:16:19 crc kubenswrapper[4778]: E0318 11:16:19.189058 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:16:23 crc kubenswrapper[4778]: I0318 11:16:23.596659 4778 generic.go:334] "Generic (PLEG): container finished" podID="339d23a2-4cea-4331-b745-44219b471d41" containerID="376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0" exitCode=0 Mar 18 11:16:23 crc kubenswrapper[4778]: I0318 11:16:23.596796 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/must-gather-8j576" event={"ID":"339d23a2-4cea-4331-b745-44219b471d41","Type":"ContainerDied","Data":"376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0"} Mar 18 11:16:23 crc kubenswrapper[4778]: I0318 11:16:23.598420 4778 scope.go:117] "RemoveContainer" containerID="376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0" Mar 18 11:16:24 crc kubenswrapper[4778]: I0318 11:16:24.187283 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4v7jl_must-gather-8j576_339d23a2-4cea-4331-b745-44219b471d41/gather/0.log" Mar 18 11:16:32 crc kubenswrapper[4778]: I0318 11:16:32.188229 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:16:32 crc kubenswrapper[4778]: E0318 11:16:32.189607 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:16:35 crc kubenswrapper[4778]: I0318 11:16:35.601047 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4v7jl/must-gather-8j576"] Mar 18 11:16:35 crc kubenswrapper[4778]: I0318 11:16:35.601879 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4v7jl/must-gather-8j576" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="copy" containerID="cri-o://07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310" gracePeriod=2 Mar 18 11:16:35 crc kubenswrapper[4778]: I0318 11:16:35.611919 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4v7jl/must-gather-8j576"] Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.028763 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4v7jl_must-gather-8j576_339d23a2-4cea-4331-b745-44219b471d41/copy/0.log" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.029519 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.148764 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjcph\" (UniqueName: \"kubernetes.io/projected/339d23a2-4cea-4331-b745-44219b471d41-kube-api-access-cjcph\") pod \"339d23a2-4cea-4331-b745-44219b471d41\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.149277 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/339d23a2-4cea-4331-b745-44219b471d41-must-gather-output\") pod \"339d23a2-4cea-4331-b745-44219b471d41\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.154712 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/339d23a2-4cea-4331-b745-44219b471d41-kube-api-access-cjcph" (OuterVolumeSpecName: "kube-api-access-cjcph") pod "339d23a2-4cea-4331-b745-44219b471d41" (UID: "339d23a2-4cea-4331-b745-44219b471d41"). InnerVolumeSpecName "kube-api-access-cjcph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.251775 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjcph\" (UniqueName: \"kubernetes.io/projected/339d23a2-4cea-4331-b745-44219b471d41-kube-api-access-cjcph\") on node \"crc\" DevicePath \"\"" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.337110 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/339d23a2-4cea-4331-b745-44219b471d41-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "339d23a2-4cea-4331-b745-44219b471d41" (UID: "339d23a2-4cea-4331-b745-44219b471d41"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.353563 4778 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/339d23a2-4cea-4331-b745-44219b471d41-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.784377 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4v7jl_must-gather-8j576_339d23a2-4cea-4331-b745-44219b471d41/copy/0.log" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.784724 4778 generic.go:334] "Generic (PLEG): container finished" podID="339d23a2-4cea-4331-b745-44219b471d41" containerID="07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310" exitCode=143 Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.784765 4778 scope.go:117] "RemoveContainer" containerID="07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.784878 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.817547 4778 scope.go:117] "RemoveContainer" containerID="376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.885694 4778 scope.go:117] "RemoveContainer" containerID="07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310" Mar 18 11:16:36 crc kubenswrapper[4778]: E0318 11:16:36.886111 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310\": container with ID starting with 07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310 not found: ID does not exist" containerID="07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.886140 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310"} err="failed to get container status \"07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310\": rpc error: code = NotFound desc = could not find container \"07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310\": container with ID starting with 07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310 not found: ID does not exist" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.886160 4778 scope.go:117] "RemoveContainer" containerID="376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0" Mar 18 11:16:36 crc kubenswrapper[4778]: E0318 11:16:36.886695 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0\": container with ID starting with 
376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0 not found: ID does not exist" containerID="376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.886722 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0"} err="failed to get container status \"376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0\": rpc error: code = NotFound desc = could not find container \"376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0\": container with ID starting with 376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0 not found: ID does not exist" Mar 18 11:16:38 crc kubenswrapper[4778]: I0318 11:16:38.198858 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="339d23a2-4cea-4331-b745-44219b471d41" path="/var/lib/kubelet/pods/339d23a2-4cea-4331-b745-44219b471d41/volumes" Mar 18 11:16:39 crc kubenswrapper[4778]: I0318 11:16:39.002937 4778 scope.go:117] "RemoveContainer" containerID="b0c173a65daa3d6727011d9a85b81569efd5870086c307d8ad02c5186b648e01" Mar 18 11:16:39 crc kubenswrapper[4778]: I0318 11:16:39.053005 4778 scope.go:117] "RemoveContainer" containerID="7417bcdf486fe4210bba0dca5e997eafe86b7f08ceaf37548fb4760f00212acc" Mar 18 11:16:47 crc kubenswrapper[4778]: I0318 11:16:47.187774 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:16:47 crc kubenswrapper[4778]: E0318 11:16:47.188531 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:17:01 crc kubenswrapper[4778]: I0318 11:17:01.187228 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:17:01 crc kubenswrapper[4778]: E0318 11:17:01.188006 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:17:08 crc kubenswrapper[4778]: E0318 11:17:08.189387 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:17:12 crc kubenswrapper[4778]: I0318 11:17:12.187426 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:17:12 crc kubenswrapper[4778]: E0318 11:17:12.188482 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.868751 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qlv6x"] Mar 18 11:17:19 crc kubenswrapper[4778]: E0318 
11:17:19.869730 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="copy" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.869744 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="copy" Mar 18 11:17:19 crc kubenswrapper[4778]: E0318 11:17:19.869771 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="gather" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.869777 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="gather" Mar 18 11:17:19 crc kubenswrapper[4778]: E0318 11:17:19.869793 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1897e5a-c532-4379-9a46-ad5355a45122" containerName="oc" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.869799 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1897e5a-c532-4379-9a46-ad5355a45122" containerName="oc" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.869970 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="gather" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.869983 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="copy" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.869997 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1897e5a-c532-4379-9a46-ad5355a45122" containerName="oc" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.871312 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.882406 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlv6x"] Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.058079 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-catalog-content\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.058308 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-utilities\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.058386 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrwdk\" (UniqueName: \"kubernetes.io/projected/05b76f7a-111e-4d55-bf5d-300863cd06d7-kube-api-access-nrwdk\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.160649 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrwdk\" (UniqueName: \"kubernetes.io/projected/05b76f7a-111e-4d55-bf5d-300863cd06d7-kube-api-access-nrwdk\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.160833 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-catalog-content\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.160871 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-utilities\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.161341 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-utilities\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.161910 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-catalog-content\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.196519 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrwdk\" (UniqueName: \"kubernetes.io/projected/05b76f7a-111e-4d55-bf5d-300863cd06d7-kube-api-access-nrwdk\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.200811 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.735995 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlv6x"] Mar 18 11:17:21 crc kubenswrapper[4778]: I0318 11:17:21.253428 4778 generic.go:334] "Generic (PLEG): container finished" podID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerID="fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465" exitCode=0 Mar 18 11:17:21 crc kubenswrapper[4778]: I0318 11:17:21.253535 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlv6x" event={"ID":"05b76f7a-111e-4d55-bf5d-300863cd06d7","Type":"ContainerDied","Data":"fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465"} Mar 18 11:17:21 crc kubenswrapper[4778]: I0318 11:17:21.253739 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlv6x" event={"ID":"05b76f7a-111e-4d55-bf5d-300863cd06d7","Type":"ContainerStarted","Data":"c977c1845e4effe2186b040420fcb1d302a3c18a5cc4ab73442c03de746df12c"} Mar 18 11:17:21 crc kubenswrapper[4778]: I0318 11:17:21.256437 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 11:17:23 crc kubenswrapper[4778]: I0318 11:17:23.281125 4778 generic.go:334] "Generic (PLEG): container finished" podID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerID="85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202" exitCode=0 Mar 18 11:17:23 crc kubenswrapper[4778]: I0318 11:17:23.281183 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlv6x" event={"ID":"05b76f7a-111e-4d55-bf5d-300863cd06d7","Type":"ContainerDied","Data":"85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202"} Mar 18 11:17:24 crc kubenswrapper[4778]: I0318 11:17:24.294031 4778 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-qlv6x" event={"ID":"05b76f7a-111e-4d55-bf5d-300863cd06d7","Type":"ContainerStarted","Data":"fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e"} Mar 18 11:17:24 crc kubenswrapper[4778]: I0318 11:17:24.314215 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qlv6x" podStartSLOduration=2.626412597 podStartE2EDuration="5.314176213s" podCreationTimestamp="2026-03-18 11:17:19 +0000 UTC" firstStartedPulling="2026-03-18 11:17:21.256153745 +0000 UTC m=+8107.830898595" lastFinishedPulling="2026-03-18 11:17:23.943917331 +0000 UTC m=+8110.518662211" observedRunningTime="2026-03-18 11:17:24.313543076 +0000 UTC m=+8110.888287926" watchObservedRunningTime="2026-03-18 11:17:24.314176213 +0000 UTC m=+8110.888921053" Mar 18 11:17:25 crc kubenswrapper[4778]: I0318 11:17:25.187546 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:17:25 crc kubenswrapper[4778]: E0318 11:17:25.187806 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:17:30 crc kubenswrapper[4778]: I0318 11:17:30.206373 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:30 crc kubenswrapper[4778]: I0318 11:17:30.207024 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:30 crc kubenswrapper[4778]: I0318 11:17:30.287324 4778 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:30 crc kubenswrapper[4778]: I0318 11:17:30.443524 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:30 crc kubenswrapper[4778]: I0318 11:17:30.537006 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlv6x"] Mar 18 11:17:32 crc kubenswrapper[4778]: I0318 11:17:32.380343 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qlv6x" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="registry-server" containerID="cri-o://fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e" gracePeriod=2 Mar 18 11:17:32 crc kubenswrapper[4778]: I0318 11:17:32.883695 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.070689 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-catalog-content\") pod \"05b76f7a-111e-4d55-bf5d-300863cd06d7\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.071311 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-utilities\") pod \"05b76f7a-111e-4d55-bf5d-300863cd06d7\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.071362 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrwdk\" (UniqueName: \"kubernetes.io/projected/05b76f7a-111e-4d55-bf5d-300863cd06d7-kube-api-access-nrwdk\") 
pod \"05b76f7a-111e-4d55-bf5d-300863cd06d7\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.073004 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-utilities" (OuterVolumeSpecName: "utilities") pod "05b76f7a-111e-4d55-bf5d-300863cd06d7" (UID: "05b76f7a-111e-4d55-bf5d-300863cd06d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.080611 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b76f7a-111e-4d55-bf5d-300863cd06d7-kube-api-access-nrwdk" (OuterVolumeSpecName: "kube-api-access-nrwdk") pod "05b76f7a-111e-4d55-bf5d-300863cd06d7" (UID: "05b76f7a-111e-4d55-bf5d-300863cd06d7"). InnerVolumeSpecName "kube-api-access-nrwdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.107707 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05b76f7a-111e-4d55-bf5d-300863cd06d7" (UID: "05b76f7a-111e-4d55-bf5d-300863cd06d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.174012 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.174055 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.174069 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrwdk\" (UniqueName: \"kubernetes.io/projected/05b76f7a-111e-4d55-bf5d-300863cd06d7-kube-api-access-nrwdk\") on node \"crc\" DevicePath \"\"" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.401127 4778 generic.go:334] "Generic (PLEG): container finished" podID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerID="fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e" exitCode=0 Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.401212 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlv6x" event={"ID":"05b76f7a-111e-4d55-bf5d-300863cd06d7","Type":"ContainerDied","Data":"fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e"} Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.401250 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlv6x" event={"ID":"05b76f7a-111e-4d55-bf5d-300863cd06d7","Type":"ContainerDied","Data":"c977c1845e4effe2186b040420fcb1d302a3c18a5cc4ab73442c03de746df12c"} Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.401263 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.401274 4778 scope.go:117] "RemoveContainer" containerID="fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.437949 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlv6x"] Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.438587 4778 scope.go:117] "RemoveContainer" containerID="85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.446221 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlv6x"] Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.457129 4778 scope.go:117] "RemoveContainer" containerID="fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.494062 4778 scope.go:117] "RemoveContainer" containerID="fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e" Mar 18 11:17:33 crc kubenswrapper[4778]: E0318 11:17:33.494683 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e\": container with ID starting with fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e not found: ID does not exist" containerID="fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.494721 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e"} err="failed to get container status \"fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e\": rpc error: code = NotFound desc = could not find container 
\"fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e\": container with ID starting with fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e not found: ID does not exist" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.494745 4778 scope.go:117] "RemoveContainer" containerID="85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202" Mar 18 11:17:33 crc kubenswrapper[4778]: E0318 11:17:33.494985 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202\": container with ID starting with 85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202 not found: ID does not exist" containerID="85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.495012 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202"} err="failed to get container status \"85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202\": rpc error: code = NotFound desc = could not find container \"85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202\": container with ID starting with 85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202 not found: ID does not exist" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.495029 4778 scope.go:117] "RemoveContainer" containerID="fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465" Mar 18 11:17:33 crc kubenswrapper[4778]: E0318 11:17:33.495328 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465\": container with ID starting with fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465 not found: ID does not exist" 
containerID="fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.495352 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465"} err="failed to get container status \"fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465\": rpc error: code = NotFound desc = could not find container \"fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465\": container with ID starting with fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465 not found: ID does not exist" Mar 18 11:17:34 crc kubenswrapper[4778]: I0318 11:17:34.202710 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" path="/var/lib/kubelet/pods/05b76f7a-111e-4d55-bf5d-300863cd06d7/volumes" Mar 18 11:17:39 crc kubenswrapper[4778]: I0318 11:17:39.192400 4778 scope.go:117] "RemoveContainer" containerID="4244424fcad241de0fa7ed0597ece19cff3267ef03d0dbab6082ae06fe156bfa" Mar 18 11:17:40 crc kubenswrapper[4778]: I0318 11:17:40.188081 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:17:40 crc kubenswrapper[4778]: E0318 11:17:40.188856 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:17:52 crc kubenswrapper[4778]: I0318 11:17:52.187449 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:17:52 crc kubenswrapper[4778]: E0318 
11:17:52.188244 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.159087 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563878-2kwvn"] Mar 18 11:18:00 crc kubenswrapper[4778]: E0318 11:18:00.159976 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="registry-server" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.159989 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="registry-server" Mar 18 11:18:00 crc kubenswrapper[4778]: E0318 11:18:00.160029 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="extract-utilities" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.160034 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="extract-utilities" Mar 18 11:18:00 crc kubenswrapper[4778]: E0318 11:18:00.160047 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="extract-content" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.160052 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="extract-content" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.160233 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" 
containerName="registry-server" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.160872 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563878-2kwvn" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.163718 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.163903 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.164284 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.170236 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563878-2kwvn"] Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.276404 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2668\" (UniqueName: \"kubernetes.io/projected/a6b4eda3-c3f7-40e9-8f26-d82054654c49-kube-api-access-q2668\") pod \"auto-csr-approver-29563878-2kwvn\" (UID: \"a6b4eda3-c3f7-40e9-8f26-d82054654c49\") " pod="openshift-infra/auto-csr-approver-29563878-2kwvn" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.379002 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2668\" (UniqueName: \"kubernetes.io/projected/a6b4eda3-c3f7-40e9-8f26-d82054654c49-kube-api-access-q2668\") pod \"auto-csr-approver-29563878-2kwvn\" (UID: \"a6b4eda3-c3f7-40e9-8f26-d82054654c49\") " pod="openshift-infra/auto-csr-approver-29563878-2kwvn" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.402540 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2668\" (UniqueName: 
\"kubernetes.io/projected/a6b4eda3-c3f7-40e9-8f26-d82054654c49-kube-api-access-q2668\") pod \"auto-csr-approver-29563878-2kwvn\" (UID: \"a6b4eda3-c3f7-40e9-8f26-d82054654c49\") " pod="openshift-infra/auto-csr-approver-29563878-2kwvn" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.502472 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563878-2kwvn" Mar 18 11:18:01 crc kubenswrapper[4778]: I0318 11:18:01.008971 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563878-2kwvn"] Mar 18 11:18:01 crc kubenswrapper[4778]: I0318 11:18:01.692325 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563878-2kwvn" event={"ID":"a6b4eda3-c3f7-40e9-8f26-d82054654c49","Type":"ContainerStarted","Data":"4cf717326dcf434c1ecca449dd2dce3b09a4c455d622981e4995c6c45b4ee5e4"} Mar 18 11:18:02 crc kubenswrapper[4778]: I0318 11:18:02.704295 4778 generic.go:334] "Generic (PLEG): container finished" podID="a6b4eda3-c3f7-40e9-8f26-d82054654c49" containerID="69d660b1da69b29e026550fe6c9425fe0d4cebfcef6d2a0c10358283a2466818" exitCode=0 Mar 18 11:18:02 crc kubenswrapper[4778]: I0318 11:18:02.704373 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563878-2kwvn" event={"ID":"a6b4eda3-c3f7-40e9-8f26-d82054654c49","Type":"ContainerDied","Data":"69d660b1da69b29e026550fe6c9425fe0d4cebfcef6d2a0c10358283a2466818"} Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.062093 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563878-2kwvn" Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.154418 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2668\" (UniqueName: \"kubernetes.io/projected/a6b4eda3-c3f7-40e9-8f26-d82054654c49-kube-api-access-q2668\") pod \"a6b4eda3-c3f7-40e9-8f26-d82054654c49\" (UID: \"a6b4eda3-c3f7-40e9-8f26-d82054654c49\") " Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.175102 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b4eda3-c3f7-40e9-8f26-d82054654c49-kube-api-access-q2668" (OuterVolumeSpecName: "kube-api-access-q2668") pod "a6b4eda3-c3f7-40e9-8f26-d82054654c49" (UID: "a6b4eda3-c3f7-40e9-8f26-d82054654c49"). InnerVolumeSpecName "kube-api-access-q2668". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.193278 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:18:04 crc kubenswrapper[4778]: E0318 11:18:04.193749 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.256911 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2668\" (UniqueName: \"kubernetes.io/projected/a6b4eda3-c3f7-40e9-8f26-d82054654c49-kube-api-access-q2668\") on node \"crc\" DevicePath \"\"" Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.730043 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29563878-2kwvn" event={"ID":"a6b4eda3-c3f7-40e9-8f26-d82054654c49","Type":"ContainerDied","Data":"4cf717326dcf434c1ecca449dd2dce3b09a4c455d622981e4995c6c45b4ee5e4"} Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.730096 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cf717326dcf434c1ecca449dd2dce3b09a4c455d622981e4995c6c45b4ee5e4" Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.730626 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563878-2kwvn" Mar 18 11:18:05 crc kubenswrapper[4778]: I0318 11:18:05.155095 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563872-w9xc8"] Mar 18 11:18:05 crc kubenswrapper[4778]: I0318 11:18:05.167372 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563872-w9xc8"] Mar 18 11:18:06 crc kubenswrapper[4778]: I0318 11:18:06.200631 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2575542-201b-40c8-baec-f64e53f357a6" path="/var/lib/kubelet/pods/c2575542-201b-40c8-baec-f64e53f357a6/volumes" Mar 18 11:18:16 crc kubenswrapper[4778]: I0318 11:18:16.186866 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:18:16 crc kubenswrapper[4778]: E0318 11:18:16.188493 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:18:30 crc kubenswrapper[4778]: I0318 11:18:30.188706 4778 scope.go:117] "RemoveContainer" 
containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:18:30 crc kubenswrapper[4778]: I0318 11:18:30.973843 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"6b0ac993b4f24855e183f727279ee7234cc746a4cd9d755c93e21afd7bbad379"} Mar 18 11:18:37 crc kubenswrapper[4778]: E0318 11:18:37.187842 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:18:39 crc kubenswrapper[4778]: I0318 11:18:39.323714 4778 scope.go:117] "RemoveContainer" containerID="9b9c4586ce364f21cf8a583a2a00575bf65854ca35b9f67e292350a899db8fd9" Mar 18 11:19:42 crc kubenswrapper[4778]: E0318 11:19:42.187855 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.149133 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563880-7p9cl"] Mar 18 11:20:00 crc kubenswrapper[4778]: E0318 11:20:00.150723 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b4eda3-c3f7-40e9-8f26-d82054654c49" containerName="oc" Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.150750 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b4eda3-c3f7-40e9-8f26-d82054654c49" containerName="oc" Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.151136 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b4eda3-c3f7-40e9-8f26-d82054654c49" containerName="oc" Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 
11:20:00.152344 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563880-7p9cl" Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.154736 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.155325 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.155433 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.160787 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563880-7p9cl"] Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.348338 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq6dx\" (UniqueName: \"kubernetes.io/projected/d5eeffef-3ac0-4175-a48f-988f221fdb87-kube-api-access-nq6dx\") pod \"auto-csr-approver-29563880-7p9cl\" (UID: \"d5eeffef-3ac0-4175-a48f-988f221fdb87\") " pod="openshift-infra/auto-csr-approver-29563880-7p9cl" Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.450298 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq6dx\" (UniqueName: \"kubernetes.io/projected/d5eeffef-3ac0-4175-a48f-988f221fdb87-kube-api-access-nq6dx\") pod \"auto-csr-approver-29563880-7p9cl\" (UID: \"d5eeffef-3ac0-4175-a48f-988f221fdb87\") " pod="openshift-infra/auto-csr-approver-29563880-7p9cl" Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.469286 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq6dx\" (UniqueName: \"kubernetes.io/projected/d5eeffef-3ac0-4175-a48f-988f221fdb87-kube-api-access-nq6dx\") pod 
\"auto-csr-approver-29563880-7p9cl\" (UID: \"d5eeffef-3ac0-4175-a48f-988f221fdb87\") " pod="openshift-infra/auto-csr-approver-29563880-7p9cl" Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.491978 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563880-7p9cl" Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.964091 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563880-7p9cl"] Mar 18 11:20:01 crc kubenswrapper[4778]: I0318 11:20:01.891772 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563880-7p9cl" event={"ID":"d5eeffef-3ac0-4175-a48f-988f221fdb87","Type":"ContainerStarted","Data":"dcc9fc95bcb5cd51a57987a594274b357968befc77a0e082a721ce5016fb5b87"} Mar 18 11:20:02 crc kubenswrapper[4778]: I0318 11:20:02.903498 4778 generic.go:334] "Generic (PLEG): container finished" podID="d5eeffef-3ac0-4175-a48f-988f221fdb87" containerID="ecc7767e917cec854c2c16c99b0004e4c693491ad73089f93315d55e813c4597" exitCode=0 Mar 18 11:20:02 crc kubenswrapper[4778]: I0318 11:20:02.903584 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563880-7p9cl" event={"ID":"d5eeffef-3ac0-4175-a48f-988f221fdb87","Type":"ContainerDied","Data":"ecc7767e917cec854c2c16c99b0004e4c693491ad73089f93315d55e813c4597"} Mar 18 11:20:03 crc kubenswrapper[4778]: E0318 11:20:02.999970 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5eeffef_3ac0_4175_a48f_988f221fdb87.slice/crio-ecc7767e917cec854c2c16c99b0004e4c693491ad73089f93315d55e813c4597.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5eeffef_3ac0_4175_a48f_988f221fdb87.slice/crio-conmon-ecc7767e917cec854c2c16c99b0004e4c693491ad73089f93315d55e813c4597.scope\": RecentStats: unable to find data in memory cache]" Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.345214 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563880-7p9cl" Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.431091 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq6dx\" (UniqueName: \"kubernetes.io/projected/d5eeffef-3ac0-4175-a48f-988f221fdb87-kube-api-access-nq6dx\") pod \"d5eeffef-3ac0-4175-a48f-988f221fdb87\" (UID: \"d5eeffef-3ac0-4175-a48f-988f221fdb87\") " Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.438537 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5eeffef-3ac0-4175-a48f-988f221fdb87-kube-api-access-nq6dx" (OuterVolumeSpecName: "kube-api-access-nq6dx") pod "d5eeffef-3ac0-4175-a48f-988f221fdb87" (UID: "d5eeffef-3ac0-4175-a48f-988f221fdb87"). InnerVolumeSpecName "kube-api-access-nq6dx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.533672 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq6dx\" (UniqueName: \"kubernetes.io/projected/d5eeffef-3ac0-4175-a48f-988f221fdb87-kube-api-access-nq6dx\") on node \"crc\" DevicePath \"\"" Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.928018 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563880-7p9cl" event={"ID":"d5eeffef-3ac0-4175-a48f-988f221fdb87","Type":"ContainerDied","Data":"dcc9fc95bcb5cd51a57987a594274b357968befc77a0e082a721ce5016fb5b87"} Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.928060 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcc9fc95bcb5cd51a57987a594274b357968befc77a0e082a721ce5016fb5b87" Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.928064 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563880-7p9cl" Mar 18 11:20:05 crc kubenswrapper[4778]: I0318 11:20:05.470019 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563874-hj4l4"] Mar 18 11:20:05 crc kubenswrapper[4778]: I0318 11:20:05.486645 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563874-hj4l4"] Mar 18 11:20:06 crc kubenswrapper[4778]: I0318 11:20:06.212866 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="894be30f-4dc9-4a4c-b443-2393b89df180" path="/var/lib/kubelet/pods/894be30f-4dc9-4a4c-b443-2393b89df180/volumes" Mar 18 11:20:30 crc kubenswrapper[4778]: I0318 11:20:30.148021 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 11:20:30 crc kubenswrapper[4778]: I0318 11:20:30.149105 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:20:39 crc kubenswrapper[4778]: I0318 11:20:39.414039 4778 scope.go:117] "RemoveContainer" containerID="8d49e3ed1fe8885369ea3edca8ac073df8e6a13bd130a54cf891e878c3833fb8" Mar 18 11:21:00 crc kubenswrapper[4778]: I0318 11:21:00.147817 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:21:00 crc kubenswrapper[4778]: I0318 11:21:00.148572 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:21:07 crc kubenswrapper[4778]: E0318 11:21:07.189388 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.147522 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.148581 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.148657 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.149897 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b0ac993b4f24855e183f727279ee7234cc746a4cd9d755c93e21afd7bbad379"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.149961 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://6b0ac993b4f24855e183f727279ee7234cc746a4cd9d755c93e21afd7bbad379" gracePeriod=600 Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.549938 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="6b0ac993b4f24855e183f727279ee7234cc746a4cd9d755c93e21afd7bbad379" exitCode=0 Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.550031 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"6b0ac993b4f24855e183f727279ee7234cc746a4cd9d755c93e21afd7bbad379"} Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.550412 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"ec3dcc697dc177d9f083b60be35d5ffbf66ca818a495e0794a59896ab1952779"} Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.550445 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"